dbt Cloud

Follow these steps to set up a metadata integration from dbt Cloud to Rasgo


Last updated 1 year ago


These instructions will help you set up the integration to sync metadata from dbt Cloud to Rasgo. If you're just wondering what will be synced from dbt Cloud, skip to the bottom of this page.

1. Set up a dbt Cloud job

Set up a new job in dbt Cloud to compile the SQL and generate docs for all dbt models in your project.

The dbt Cloud job does not need to run the models; it only needs to compile the SQL.

When setting up the new job:

  • Choose your Production environment

  • Check the box for Generate Docs

  • In Commands, add this command: dbt compile --full-refresh

  • In Triggers -> Schedule, choose to run on a schedule, and have it run every day at exactly hour 8 UTC (midnight PST; 3 AM EST)

  • Click Save

  • All done! You've made a new job.

2. Run your new job

This job needs to run and generate metadata before Rasgo can import it. To run the job, click the green Run Now button, and wait until it completes and shows a green Success status for the run.
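If you want to confirm the run's status programmatically rather than in the UI, dbt Cloud's Administrative API reports run status as numeric codes. Here is a minimal sketch; the code-to-label mapping below is an assumption based on dbt Cloud API v2 documentation, so verify it against the current docs:

```python
# Assumed dbt Cloud run status codes (per Administrative API v2 docs;
# verify against your dbt Cloud version before relying on these).
RUN_STATUS = {
    1: "Queued",
    2: "Starting",
    3: "Running",
    10: "Success",
    20: "Error",
    30: "Cancelled",
}

def run_state(status_code: int) -> str:
    """Return a human-readable label for a dbt Cloud run status code."""
    return RUN_STATUS.get(status_code, "Unknown")

def is_successful(status_code: int) -> bool:
    """True only when the run reached the terminal Success state."""
    return status_code == 10
```

Rasgo only needs the run to have finished with Success once, so a one-off check after the first run is enough.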

3. Copy your Service Token, account ID, and job ID

To set up the integration in Rasgo, you'll need 3 things:

  1. Read-Only service token for dbt Cloud

  2. dbt Cloud account ID

  3. dbt Cloud job ID

To generate the service token, go to your 'Account Settings' page in dbt Cloud and click 'Service Tokens' in the left side nav. Click 'New Token' and for the Permission Set, choose 'Read-Only'.

Click 'Save' and then make sure to copy the Service Token to your clipboard, because you won't be able to see it again.
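Before handing the token to Rasgo, you can sanity-check it by listing your accounts through the dbt Cloud API. A minimal sketch that only builds the request (the base URL and `Token` authorization scheme follow dbt Cloud's v2 API as I understand it; treat both as assumptions, and substitute your real token):

```python
# Base URL for dbt Cloud's Administrative API v2 (assumed; multi-region
# accounts may use a different host).
DBT_CLOUD_API = "https://cloud.getdbt.com/api/v2"

def accounts_request(service_token: str) -> tuple:
    """Return (url, headers) for dbt Cloud's 'list accounts' endpoint.

    A Read-Only service token is passed in the Authorization header.
    """
    url = f"{DBT_CLOUD_API}/accounts/"
    headers = {
        "Authorization": f"Token {service_token}",
        "Accept": "application/json",
    }
    return url, headers

# Build the request with a placeholder token (no network call made here):
url, headers = accounts_request("YOUR_SERVICE_TOKEN")
```

Issue the request with curl or `requests.get(url, headers=headers)`; a 200 response confirms the token is valid.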

Next, to get the IDs, you can pull them directly out of the URL. The URL will be structured like this:

https://cloud.getdbt.com/next/deploy/{{account_id}}/projects/{{project_id}}/jobs/{{job_id}}

Just save those two numbers, the account_id and job_id, for later and you're ready to go.
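If you'd rather not eyeball the URL, a small sketch that pulls the two IDs out of a dbt Cloud job URL (the path pattern matches the structure shown above; adjust the pattern if dbt Cloud changes its URL layout):

```python
import re

def parse_dbt_cloud_url(url: str) -> dict:
    """Extract account_id and job_id from a dbt Cloud job URL."""
    match = re.search(r"/deploy/(\d+)/projects/(\d+)/jobs/(\d+)", url)
    if not match:
        raise ValueError(f"Unrecognized dbt Cloud job URL: {url}")
    account_id, _project_id, job_id = match.groups()
    return {"account_id": int(account_id), "job_id": int(job_id)}

# Example with made-up IDs:
ids = parse_dbt_cloud_url(
    "https://cloud.getdbt.com/next/deploy/12345/projects/678/jobs/90210"
)
# ids -> {"account_id": 12345, "job_id": 90210}
```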

4. Add the integration in Rasgo

  • Log into Rasgo and navigate to the account management screen at https://app.rasgoml.com/profile

  • Scroll down to find the dbt Connect section at the bottom

  • Paste in the Service Token, account ID, and job ID

  • Click "Connect"

5. Import your data into Rasgo

The integration is set up and good to go! Now, you can import your dbt models into Rasgo by clicking the "Import Now" button.

Click this button each time you wish to import or sync your models with Rasgo. By default, Rasgo will detect if a new dbt run has occurred since the last time you synced, and only run an import if it has.

If you need your dbt import to run on a recurring schedule, contact Rasgo support to discuss configuration options.

Rasgo only imports metadata from dbt manifest files; it does not move actual data out of your cloud data warehouse or edit your dbt project.

Here is the metadata that Rasgo will ingest from dbt Cloud:

  • dbt models -> Rasgo datasets

    • Description

    • Lineage

    • Columns

      • Column descriptions

    • SQL

  • dbt metrics -> Rasgo metrics

    • Metric definition
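The model metadata above lives in dbt's `manifest.json` artifact. As an illustration, a hedged sketch of pulling those fields from a parsed manifest dict (key names follow the dbt manifest schema as commonly documented, e.g. `compiled_code` in dbt 1.3+ vs `compiled_sql` earlier; verify against your dbt version):

```python
def extract_model_metadata(manifest: dict) -> list:
    """Pull model name, description, columns, SQL, and lineage from a
    parsed dbt manifest.json dict. Key names are assumptions based on
    the dbt manifest schema; check your dbt version's artifact docs."""
    models = []
    for node_id, node in manifest.get("nodes", {}).items():
        if node.get("resource_type") != "model":
            continue
        models.append({
            "name": node.get("name"),
            "description": node.get("description"),
            # Column name -> column description
            "columns": {
                col_name: col.get("description")
                for col_name, col in node.get("columns", {}).items()
            },
            # dbt 1.3+ uses compiled_code/raw_code; older versions
            # used compiled_sql/raw_sql.
            "sql": node.get("compiled_code") or node.get("raw_code"),
            "depends_on": node.get("depends_on", {}).get("nodes", []),  # lineage
        })
    return models

# Tiny illustrative manifest fragment (not a real dbt artifact):
sample = {
    "nodes": {
        "model.proj.orders": {
            "resource_type": "model",
            "name": "orders",
            "description": "One row per order",
            "columns": {"order_id": {"description": "Primary key"}},
            "compiled_code": "select * from raw.orders",
            "depends_on": {"nodes": ["source.proj.raw_orders"]},
        }
    }
}
models = extract_model_metadata(sample)
```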
