
dbt Tests in Snowflake: Override Failure Tables + Email Alerts

Imagine you're running a retail analytics pipeline on Snowflake. Every night, customer orders are loaded from multiple source systems into your Orders table. One day, business users report that the dashboard is showing negative order amounts and orders dated in the future. Clearly, bad data has slipped through.

You already have dbt tests in place to catch these issues, but two challenges come up when you start using them in production with Snowflake:

  1. Ugly failure tables:
    With store_failures: true, dbt persists failing rows, but names the table something like dbt_test__audit.accepted_values_orders_status__shipped__completed__pending. Not exactly human-friendly.
  2. Silent failures:
    Tests may fail quietly. Bad rows are logged, but nobody knows unless they manually query the audit schema or parse run logs.

Both problems make it harder to operationalize data quality in Snowflake.

The Goal

We built a framework around dbt + Snowflake that does two things:

  1. Override failure table names with meaningful aliases.
  2. Automatically send email alerts when tests fail — with no manual dependency.

Step 1: Override the Failure Table Name with a Custom Test

{% test value_between(model, column_name, min_value, max_value) %}

{# store_failures persists bad rows; schema centralizes them; alias gives a readable table name #}
{{ config(
    store_failures = true,
    schema = 'AUDIT',
    alias = 'ORDERS_AMOUNT_RANGE_FAILS'
) }}

select
    order_id,
    customer_id,
    {{ column_name }} as invalid_amount,
    current_timestamp as test_run_time
from {{ model }}
where {{ column_name }} < {{ min_value }}
   or {{ column_name }} > {{ max_value }}

{% endtest %}

Now, instead of an opaque table name, failures are logged to AUDIT.ORDERS_AMOUNT_RANGE_FAILS.

This makes it much easier to monitor and trace.
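Because the failure table now has a stable, readable name, spot-checking it is a one-line query. A sketch (the columns come from the custom test defined above):

```sql
-- Inspect the most recent failing rows captured by the value_between test
select order_id, customer_id, invalid_amount, test_run_time
from AUDIT.ORDERS_AMOUNT_RANGE_FAILS
order by test_run_time desc
limit 100;
```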

Step 2: Add a Smart Notification Macro

Next, we created a macro that:

  • Checks if dbt tests were actually run.
  • Queries the failure table for new rows in the last 10 minutes.
  • If rows exist → sends an email using Snowflake’s SYSTEM$SEND_EMAIL.
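A minimal sketch of such a macro, using dbt's run_query together with Snowflake's SYSTEM$SEND_EMAIL. The notification integration name (dbt_email_int) and recipient address are placeholders; a matching notification integration must already exist in Snowflake, and the failure table must exist before the check runs:

```jinja
{% macro notify_on_failure(failure_table) %}
  {# Only act when tests actually ran in this invocation #}
  {% if execute and flags.WHICH in ('test', 'build') %}
    {% set check_sql %}
        select count(*) as failure_count
        from AUDIT.{{ failure_table }}
        where test_run_time >= dateadd('minute', -10, current_timestamp)
    {% endset %}
    {% set failure_count = run_query(check_sql).columns[0].values()[0] %}
    {% if failure_count > 0 %}
      {# Placeholders: integration name and recipient are illustrative #}
      {% do run_query(
        "call SYSTEM$SEND_EMAIL("
        ~ "'dbt_email_int', "
        ~ "'data-team@example.com', "
        ~ "'dbt test failures: " ~ failure_table ~ "', "
        ~ "'" ~ failure_count ~ " new failing row(s) in AUDIT." ~ failure_table ~ "')"
      ) %}
    {% endif %}
  {% endif %}
{% endmacro %}
```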

Step 3: Hook into dbt Lifecycle

We attach this macro to the dbt lifecycle so it runs automatically at the end of a job:

# dbt_project.yml
on-run-end:
  - "{{ notify_on_failure('ORDERS_AMOUNT_RANGE_FAILS') }}"

When we ran dbt test, the following happened:

  1. Failures were stored in AUDIT.ORDERS_AMOUNT_RANGE_FAILS.
  2. Macro checked the table and found 2 new rows.
  3. Snowflake sent an email alert.

Apply the test in schema.yml:
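A possible schema.yml entry wiring the custom test to the orders data. The model name, column name, and bounds are illustrative, based on the retail example above:

```yaml
version: 2

models:
  - name: orders
    columns:
      - name: order_amount   # hypothetical column name
        tests:
          - value_between:
              min_value: 0
              max_value: 100000
```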

Run the tests with dbt test.

Screenshots: the dbt test failure table and the error email alert.

Why This Matters

With just a few lines of dbt + Snowflake config, we solved two real-world issues:

  1. Readable failure tables → easier root-cause analysis.
  2. Real-time notifications → no more silent failures.

This approach turns dbt tests into an actionable data quality framework inside Snowflake.
