
Fix: dbt Not Working — ref() Not Found, Schema Mismatch, and Compilation Errors

FixDevs

Quick Answer

How to fix the most common dbt errors: ref() model not found, profile not found, database relation does not exist, incremental model schema mismatch (fixed with --full-refresh), dbt deps failures, Jinja compilation errors, and test failures.

The Error

Your model references another model but dbt can’t find it:

Compilation Error in model orders_summary (models/marts/orders_summary.sql)
  Model 'model.my_project.stg_orders' not found.

Or dbt can’t connect to your database because it can’t find your profile:

Runtime Error
  Could not find profile named 'my_project'

Or the model ran before but now fails after a schema change:

Database Error in model orders_incremental
  column "new_field" of relation "analytics.orders_incremental" does not exist

Or a test you added blocks deployment:

Failure in test not_null_orders_customer_id (models/staging/schema.yml)
  Got 47 results, configured to fail if != 0

dbt sits between your source data and your analytics layer — it transforms SQL into a graph of models with explicit dependencies. When any node in that graph has an error, everything downstream breaks. This guide covers the root causes and fixes for each failure mode.

Why This Happens

dbt builds a directed acyclic graph (DAG) of models linked by ref() calls. If a model is missing, renamed, or in an excluded directory, every model that references it via ref() fails to compile. The profiles.yml file is dbt’s connection config — its absence or misconfiguration prevents any run from reaching the database. Incremental models are stateful: if you change a model’s schema after it’s been materialized as a table, dbt can’t reconcile the new SQL with the existing table structure unless you force a full rebuild.

Fix 1: ref() / source() Node Not Found

Compilation Error: Model 'model.my_project.stg_orders' not found.
Compilation Error: Source 'source.my_project.raw.orders' not found.

ref() looks up models by filename (without .sql). The most common cause is a typo in the model name or the file living in a directory dbt isn’t scanning.

-- models/marts/orders_summary.sql
-- WRONG — typo in model name
SELECT * FROM {{ ref('stg_order') }}    -- file is stg_orders.sql

-- CORRECT
SELECT * FROM {{ ref('stg_orders') }}

Check where dbt is looking by listing all resolved models:

dbt ls --select "*"          # All models in the project (quoted to stop shell globbing)
dbt ls --select "stg_*"      # Models starting with stg_

If stg_orders doesn’t appear in dbt ls, the file isn’t being found. Check:

  1. The file is inside one of the paths listed under model-paths in dbt_project.yml (default: ["models"])
  2. The file extension is .sql
  3. The filename matches exactly — dbt is case-sensitive on Linux/macOS

source() requires a source definition in a schema YAML file. If you get a source not found error, you need to declare it:

# models/staging/schema.yml
version: 2

sources:
  - name: raw                      # First arg to source()
    database: my_database          # Optional — overrides target database
    schema: raw_data               # Schema in the database
    tables:
      - name: orders               # Second arg to source()
      - name: customers
      - name: products

Then reference it in a model:

-- models/staging/stg_orders.sql
SELECT
    id,
    customer_id,
    order_date,
    total_amount
FROM {{ source('raw', 'orders') }}

Circular references cause compilation to fail with a less obvious error. If model A references model B which references model A, dbt can’t build the graph. Run dbt ls --select +my_model (upstream) and dbt ls --select my_model+ (downstream) to trace dependencies.
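For illustration, the smallest possible cycle looks like this (hypothetical model names — neither model compiles because each depends on the other):

```sql
-- models/model_a.sql (hypothetical example)
SELECT * FROM {{ ref('model_b') }}

-- models/model_b.sql — references model_a back, closing the cycle
SELECT * FROM {{ ref('model_a') }}
```

Breaking the cycle usually means extracting the shared logic into a third upstream model that both can ref().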

Fix 2: Profile Not Found — profiles.yml Configuration

Runtime Error
  Could not find profile named 'my_project'
  Run `dbt debug` to check your connection.

The profiles.yml file holds your database credentials. By default, dbt looks for it at ~/.dbt/profiles.yml. The profile name in this file must match the profile: field in your dbt_project.yml.

Check your project’s expected profile name:

grep "^profile:" dbt_project.yml
# profile: my_project

Create or fix ~/.dbt/profiles.yml:

# ~/.dbt/profiles.yml

my_project:                          # Must match dbt_project.yml profile:
  target: dev
  outputs:
    dev:
      type: postgres                 # postgres | bigquery | snowflake | redshift | duckdb
      host: localhost
      user: dbt_user
      password: "{{ env_var('DBT_PASSWORD') }}"   # Never hardcode passwords
      port: 5432
      dbname: analytics
      schema: dbt_dev                # Your personal dev schema
      threads: 4

    prod:
      type: postgres
      host: prod-db.example.com
      user: dbt_prod
      password: "{{ env_var('DBT_PROD_PASSWORD') }}"
      port: 5432
      dbname: analytics
      schema: analytics              # Shared prod schema
      threads: 8

Test the connection before running models:

dbt debug

dbt debug checks: profile file found, profile name resolves, database connection succeeds, adapter package installed. It prints the exact failure point.

Use --profiles-dir if the file isn’t in the default location:

dbt run --profiles-dir ./config/   # Looks for profiles.yml in ./config/

This is useful in CI/CD environments where you can’t write to ~/.dbt/.

env_var() in profiles reads environment variables at runtime — the safe way to handle credentials:

export DBT_PASSWORD=mysecretpassword
dbt run

In CI, set these as secrets in your CI platform (GitHub Actions, GitLab CI, etc.) and pass them as environment variables to the dbt step.
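As a sketch, a GitHub Actions job might inject the secret like this (workflow layout, adapter, and secret name are illustrative assumptions, not taken from this project):

```yaml
# .github/workflows/dbt.yml — illustrative fragment
jobs:
  dbt-run:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: pip install dbt-postgres
      - run: dbt run --profiles-dir ./config/
        env:
          DBT_PASSWORD: ${{ secrets.DBT_PASSWORD }}   # Defined in repo Settings → Secrets
```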

Fix 3: Database Relation Does Not Exist

Database Error in model orders_summary (models/marts/orders_summary.sql)
  relation "analytics.stg_orders" does not exist
  LINE 1: ... FROM analytics.stg_orders

This error means dbt compiled successfully but the table it’s trying to read from doesn’t exist in the database yet.

The upstream model hasn’t been run. If orders_summary does ref('stg_orders') but stg_orders was never materialized, the compiled SQL points at a relation that doesn’t exist yet:

# Run in dependency order — dbt resolves the order automatically
dbt run                          # Runs all models
dbt run --select stg_orders+     # stg_orders and all downstream models

# Run a specific model and all its parents (upstream dependencies)
dbt run --select +orders_summary  # orders_summary and everything it depends on

Target schema doesn’t exist. dbt uses the schema defined in profiles.yml as the output schema. On most adapters the first dbt run creates that schema automatically; if creation fails (usually a permissions problem), create it manually:

CREATE SCHEMA IF NOT EXISTS dbt_dev;
GRANT CREATE ON SCHEMA dbt_dev TO dbt_user;
GRANT USAGE ON SCHEMA dbt_dev TO dbt_user;

For PostgreSQL schema permission errors, see PostgreSQL relation does not exist.

Stale refs from a renamed file. You renamed stg_orders.sql to staging_orders.sql, but another model still has {{ ref('stg_orders') }} — that ref now fails at compile time, and any SQL that hardcoded the old table name fails at the database. Fix: update every ref() call (and any hardcoded names), then run dbt run to build the model under its new name.

Fix 4: Incremental Model Schema Mismatch — --full-refresh

Database Error in model orders_incremental
  column "new_field" of relation "analytics.orders_incremental" does not exist

Incremental models append or update rows without rebuilding the entire table. When you add, rename, or remove a column from an incremental model’s SQL, the existing table in the database doesn’t automatically change to match. dbt tries to INSERT into the old schema and fails.

The fix: --full-refresh drops and rebuilds the table from scratch:

# Rebuild one specific incremental model
dbt run --select orders_incremental --full-refresh

# Rebuild all incremental models
dbt run --full-refresh

Standard incremental model pattern:

-- models/marts/orders_incremental.sql
{{ config(
    materialized='incremental',
    unique_key='order_id',          -- Used for MERGE/UPSERT on supported adapters
    on_schema_change='append_new_columns'   -- Automatically add new columns on schema change
) }}

SELECT
    order_id,
    customer_id,
    order_date,
    total_amount,
    status
FROM {{ source('raw', 'orders') }}

{% if is_incremental() %}
    -- Only process rows newer than the most recent row in the current table
    WHERE order_date > (SELECT MAX(order_date) FROM {{ this }})
{% endif %}

{{ this }} is a dbt macro that resolves to the current model’s relation (schema + table name). {% if is_incremental() %} is False on the first run (full load) and True on subsequent incremental runs.
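To make the two modes concrete, here is roughly what the compiled SQL looks like in each case (simplified; schema names assumed from the earlier examples):

```sql
-- First run: is_incremental() is False, so the WHERE block is dropped (full load)
SELECT order_id, customer_id, order_date, total_amount, status
FROM raw_data.orders

-- Later runs: is_incremental() is True and {{ this }} resolves to the existing table
SELECT order_id, customer_id, order_date, total_amount, status
FROM raw_data.orders
WHERE order_date > (SELECT MAX(order_date) FROM analytics.orders_incremental)
```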

The on_schema_change config controls what happens when your SQL adds or removes columns:

{{ config(
    materialized='incremental',
    on_schema_change='append_new_columns'
    -- Options:
    -- 'ignore'              — default; table left unchanged (later inserts may fail)
    -- 'append_new_columns'  — adds new columns, keeps removed ones
    -- 'sync_all_columns'    — adds new columns, drops removed ones (and their data)
    -- 'fail'                — raises an error when the schema changes
) }}

Pro Tip: Add --full-refresh to your staging environment CI job so incremental models are rebuilt from scratch on every PR merge. Save true incrementality for production where the full dataset is too large to rebuild. This catches schema drift before it reaches prod.

Fix 5: dbt deps Failure — Package Installation Errors

Encountered an error:
  Package 'dbt-labs/dbt_utils' was not found in the package index
  Invalid version spec '1.99.0' for package dbt-labs/dbt_utils

dbt packages are installed via dbt deps, which reads packages.yml (dbt Core) or dependencies.yml (dbt 1.6+).

Check compatible versions at hub.getdbt.com — the package must support your dbt Core version:

# packages.yml
packages:
  - package: dbt-labs/dbt_utils
    version: [">=1.0.0", "<2.0.0"]    # Range is safer than pinning to exact

  - package: dbt-labs/codegen
    version: 0.12.0

  - git: https://github.com/org/private-dbt-package.git
    revision: v1.2.0                   # Tag or branch for private repos

dbt deps must be run after any packages.yml change:

dbt deps
# Creates dbt_packages/ directory with installed packages

Check your dbt Core version for compatibility:

dbt --version
# Core:
#   - installed: 1.8.3
#   - latest:    1.8.4 - Update available!

Clear and reinstall packages when the dbt_packages/ directory is corrupted:

rm -rf dbt_packages/
dbt deps

Add dbt_packages/ to .gitignore — it’s the equivalent of node_modules:

dbt_packages/
target/
logs/
.user.yml

Fix 6: Test Failures — not_null, unique, accepted_values

Failure in test not_null_orders_customer_id (models/staging/schema.yml)
  Got 47 results, configured to fail if != 0

dbt tests query your transformed data and report rows that violate the constraint. A “failure” means rows were returned — it doesn’t mean the SQL broke.

Define tests in a schema YAML file:

# models/staging/schema.yml
version: 2

models:
  - name: stg_orders
    description: "Cleaned orders from the raw source"
    columns:
      - name: order_id
        description: "Primary key"
        tests:
          - unique
          - not_null

      - name: status
        tests:
          - not_null
          - accepted_values:
              values: ['pending', 'shipped', 'delivered', 'cancelled']

      - name: customer_id
        tests:
          - not_null
          - relationships:
              to: ref('stg_customers')
              field: customer_id

Run tests selectively to diagnose failures:

dbt test                                         # All tests
dbt test --select stg_orders                     # Tests for one model
dbt test --select not_null_stg_orders_order_id   # One specific test by name
dbt test --select test_name:not_null             # All not_null generic tests

Inspect failing rows — dbt prints the compiled SQL. Run it directly in your database to see the actual failing rows:

# After a test failure, dbt shows:
# compiled Code at target/compiled/.../not_null_orders_order_id.sql

cat target/compiled/my_project/models/staging/schema.yml/not_null_stg_orders_order_id.sql
# Run this in your SQL client to see which rows fail
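The compiled not_null test boils down to a query like this — any rows it returns are the failures (schema name assumed from the earlier error messages):

```sql
-- Roughly what target/compiled/.../not_null_stg_orders_order_id.sql contains
SELECT order_id
FROM analytics.stg_orders
WHERE order_id IS NULL
```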

Control test severity — downgrade failures to warnings for non-critical checks:

columns:
  - name: customer_id
    tests:
      - not_null:
          severity: warn      # Log a warning instead of failing the run
          warn_if: ">= 10"    # Warn only when 10 or more rows fail
          error_if: ">= 100"  # Escalate to an error at 100 or more failing rows

Common Mistake: Writing dbt test and seeing failures, then assuming the data is wrong. Always check whether the model was actually run first — tests on a model that was never materialized hit an empty or stale table. Run dbt run && dbt test (or dbt build, which runs models and their tests together in DAG order) to test fresh data.

Fix 7: Jinja Compilation Errors — Template Syntax

Compilation Error in model my_model (models/my_model.sql)
  Tag 'endfor' was unexpected (expected 'endif')

dbt uses Jinja2 for templating. Syntax errors in {{ }} expressions or {% %} tags prevent the SQL from compiling.

Check block pairing — every {% if %} needs {% endif %}, every {% for %} needs {% endfor %}:

-- WRONG — missing endif
{% if is_incremental() %}
WHERE updated_at > (SELECT MAX(updated_at) FROM {{ this }})
{% endfor %}   -- should be endif

-- CORRECT
{% if is_incremental() %}
WHERE updated_at > (SELECT MAX(updated_at) FROM {{ this }})
{% endif %}

Compile without running to see the generated SQL before executing:

# Compile all models — outputs to target/compiled/
dbt compile

# Compile a specific model and print to stdout
dbt compile --select my_model --quiet
cat target/compiled/my_project/models/my_model.sql

This shows the SQL after Jinja rendering — invaluable for debugging template logic.

Common Jinja patterns in dbt:

-- Model config block (must be at the top)
{{ config(
    materialized='table',
    schema='marts',
    tags=['daily', 'finance']
) }}

-- Conditional SQL
{% set payment_methods = ['credit_card', 'paypal', 'bank_transfer'] %}

SELECT
    order_id,
    {% for method in payment_methods %}
    SUM(CASE WHEN payment_method = '{{ method }}' THEN amount ELSE 0 END) AS {{ method }}_amount
    {% if not loop.last %},{% endif %}
    {% endfor %}
FROM {{ ref('stg_payments') }}
GROUP BY order_id

-- Environment-specific behavior
{% if target.name == 'prod' %}
    WHERE created_at >= DATEADD(day, -90, CURRENT_DATE)
{% else %}
    WHERE created_at >= DATEADD(day, -7, CURRENT_DATE)
{% endif %}

target variable gives you runtime context: target.name (dev/prod), target.schema, target.database, target.type (postgres/snowflake/etc.).

Fix 8: dbt_project.yml and Schema YAML Errors

Parsing Error
  Error reading dbt_project.yml
YAML Error
  yaml.scanner.ScannerError: mapping values are not allowed here

YAML is whitespace-sensitive. A single extra space or wrong indentation level breaks the entire config file.

Common dbt_project.yml mistakes:

# RISKY — without the + prefix, dbt can't tell config keys from directory names
models:
  my_project:
    materialized: table          # Ambiguous: config key or a folder named 'materialized'?

# CORRECT — config keys need the + prefix in model paths
models:
  my_project:
    +materialized: view          # Default for all models
    marts:
      +materialized: table       # Override for models in marts/
      +schema: marts             # Custom schema suffix for marts

# WRONG — 'version: 2' belongs in schema YAML files, not dbt_project.yml
version: 2
name: my_project

# CORRECT — dbt_project.yml names the project and its connection profile
name: my_project
profile: my_project
config-version: 2                # Optional since dbt 1.5; required before that

Validate YAML syntax before running dbt:

python -c "import yaml; yaml.safe_load(open('dbt_project.yml'))"
# No output = valid YAML
# ScannerError = tells you the line number of the problem

For general YAML syntax errors, see YAML mapping values not allowed here.

Schema YAML files (schema.yml) follow strict structure. The version: 2 key is required at the top:

version: 2   # Required

models:
  - name: stg_orders
    description: "Staged orders"
    columns:
      - name: order_id
        description: "Surrogate key"
        tests:
          - unique
          - not_null

Indentation must be consistent — mixing tabs and spaces causes ScannerError.

Still Not Working?

Debugging with dbt debug and --log-level debug

dbt debug                            # Tests connection and profiles
dbt run --log-level debug            # Verbose output: SQL sent to the database
dbt run --select my_model --debug    # Compilation + execution details

The --debug flag prints the actual compiled SQL for every model before executing it, which makes it easy to spot Jinja rendering issues.

Snowflake Case Sensitivity

Snowflake stores unquoted identifiers in uppercase and matches unquoted references case-insensitively, so lowercase SQL normally works. PostgreSQL folds unquoted identifiers to lowercase instead, which is why the same quoting mistake can surface on one database and not the other:

-- Creates column ORDER_ID in Snowflake (uppercase)
SELECT order_id FROM raw.orders

-- Querying it — works because Snowflake is case-insensitive for unquoted names
SELECT order_id FROM stg_orders
SELECT ORDER_ID FROM stg_orders   -- Both work

The issue appears when you mix quoted and unquoted references. Avoid quoting column names in dbt unless absolutely necessary.
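If you need to control quoting rather than avoid it, dbt exposes a quoting config — shown here as a sketch on the source from Fix 1 — that determines whether identifiers are quoted in compiled SQL:

```yaml
# models/staging/schema.yml — unquoted names let Snowflake match case-insensitively
sources:
  - name: raw
    quoting:
      database: false
      schema: false
      identifier: false
    tables:
      - name: orders
```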

Running Specific Models with --select

# Single model
dbt run --select stg_orders

# Model + all upstream dependencies
dbt run --select +stg_orders

# Model + all downstream dependents
dbt run --select stg_orders+

# Both upstream and downstream
dbt run --select +stg_orders+

# All models with a tag
dbt run --select tag:daily

# Models in a directory
dbt run --select models/staging

# Exclude a model
dbt run --exclude stg_test

Integration with Airflow

dbt models are often scheduled via Apache Airflow — either with the astronomer-cosmos package, which renders the dbt DAG as Airflow tasks (DbtDag / DbtTaskGroup), or by wrapping dbt run in a BashOperator. If your Airflow-triggered dbt runs fail but manual runs succeed, check that the Airflow worker has access to ~/.dbt/profiles.yml (or the --profiles-dir flag points to the right location) and that all DBT_* environment variables are passed to the task. See Airflow not working for task environment variable patterns.

dbt Cloud vs dbt Core

dbt Cloud hosts the profiles, schedules runs, and provides a browser IDE — no local profiles.yml needed. If you develop in dbt Cloud but run tests locally with dbt Core, you need a local profiles.yml that mirrors the dbt Cloud connection. Export the connection details from dbt Cloud’s “Connections” section and use dbt debug locally to confirm they work.


FixDevs

Solo developer based in Japan. Every solution is cross-referenced with official documentation and tested before publishing.

