BigQuery


How does Rasgo work with BigQuery?

Rasgo is a metadata-only product: all of your actual data stays in your data warehouse, and Rasgo queries it there by dynamically generating SQL.

Rasgo performs read-only operations in your BigQuery environment (a minimal sketch of both operations follows this list):

  • Rasgo reads the information schema for tables and columns it has access to

  • Rasgo dynamically generates and executes SQL on behalf of the user to analyze data
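
For illustration only, here is a sketch in Python of what those two operations look like with the official google-cloud-bigquery client. This is not Rasgo's actual code, and the project, dataset, and table names (analytics, rasgo, orders) are assumptions.

    # Sketch of the two read-only operations described above (illustrative only)
    from google.cloud import bigquery

    client = bigquery.Client(project="analytics")  # hypothetical project

    # 1) Read the information schema for tables and columns Rasgo can access
    metadata_sql = """
        SELECT table_name, column_name, data_type
        FROM `analytics.rasgo.INFORMATION_SCHEMA.COLUMNS`
        ORDER BY table_name, ordinal_position
    """
    for row in client.query(metadata_sql).result():
        print(row.table_name, row.column_name, row.data_type)

    # 2) Dynamically generated SQL, executed on behalf of a user to analyze data
    analysis_sql = """
        SELECT status, COUNT(*) AS orders
        FROM `analytics.rasgo.orders`
        GROUP BY status
    """
    for row in client.query(analysis_sql).result():
        print(row.status, row.orders)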

Configure BigQuery for Rasgo

Step 1: Create the service account in Google Cloud Console

Follow these instructions in the BigQuery docs to create a service account and obtain a JSON service account key.
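
If you want to sanity-check the downloaded key before handing it to Rasgo, a short Python snippet using google-auth and google-cloud-bigquery will do. The file name rasgo-service-account.json is an assumption; use the path of the key you downloaded.

    from google.oauth2 import service_account
    from google.cloud import bigquery

    # Load the JSON key from this step and confirm it parses
    creds = service_account.Credentials.from_service_account_file("rasgo-service-account.json")
    print(creds.service_account_email)  # the account to grant IAM roles to in Step 2
    print(creds.project_id)             # the project the key belongs to

    # Listing datasets will only succeed once the Step 2 roles are granted
    client = bigquery.Client(credentials=creds, project=creds.project_id)
    print([d.dataset_id for d in client.list_datasets(max_results=5)])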

Step 2: Grant IAM permissions to the new service account

Rasgo needs the following IAM roles to run:

  • BigQuery Data Editor: Editor permissions to read and write to a single project.dataset (e.g. analytics.rasgo)

  • BigQuery User: User permissions to run queries in the same project as above

  • BigQuery Data Viewer: Viewer permissions to read from all projects and datasets that should be cataloged by Rasgo

  • Storage Object Creator: Create CSVs for download

  • Storage Object Viewer: View downloaded CSVs

  • Storage Bucket Viewer: View buckets for download
Step 3: Connect Rasgo to Your Data

In the Rasgo UI, enter the BigQuery project and dataset that Rasgo should write to, as well as the service account key credentials.

After this step, you're ready to start using Rasgo! Consider the remaining steps to enhance your users' experience:
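
Before moving on, you can optionally verify, with the same JSON key you just entered, that the Step 2 roles behave as expected. This is a rough sketch, assuming the write target is analytics.rasgo; adjust the names for your environment.

    from google.oauth2 import service_account
    from google.cloud import bigquery

    creds = service_account.Credentials.from_service_account_file("rasgo-service-account.json")
    client = bigquery.Client(credentials=creds, project="analytics")  # project Rasgo writes to

    # BigQuery Data Viewer + BigQuery User: metadata should be readable via queries
    rows = client.query(
        "SELECT COUNT(*) AS n FROM `analytics.rasgo.INFORMATION_SCHEMA.TABLES`"
    ).result()
    print("tables visible in analytics.rasgo:", next(iter(rows)).n)

    # BigQuery Data Editor: Rasgo must be able to create tables in its own dataset
    client.query(
        "CREATE OR REPLACE TABLE `analytics.rasgo._rasgo_smoke_test` AS SELECT 1 AS ok"
    ).result()
    client.delete_table("analytics.rasgo._rasgo_smoke_test")
    print("write access to analytics.rasgo confirmed")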

Step 4: Create a bucket for CSV exports

While using Rasgo, users can select a GCS bucket and download CSVs to it. This is set on the User profile screen.

To enable this functionality, a Google Admin should ensure that users have access to view and write to a preferred bucket. If needed, create a bucket in the Google Cloud Storage console.

Rasgo strongly recommends setting up a lifecycle rule for the CSV files downloaded to this bucket: after creating the bucket, open the Lifecycle tab on the bucket's details page and add a delete rule that removes files once they are 1 day old (or your preferred lifecycle).
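
If you prefer to script this step rather than click through the console, the sketch below uses the google-cloud-storage client to create the bucket and attach the recommended 1-day delete rule. The project and bucket names are made up; bucket names must be globally unique.

    from google.cloud import storage

    client = storage.Client(project="analytics")  # hypothetical project

    # Create the export bucket (pick your own globally unique name)
    bucket = client.create_bucket("example-rasgo-csv-exports", location="US")

    # Lifecycle rule: delete exported CSVs one day after they are written
    bucket.add_lifecycle_delete_rule(age=1)
    bucket.patch()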

Step 5 (Optional): Enable OAuth for your Users

Rasgo supports individual user credentials when executing queries on BigQuery via the BigQuery OAuth integration. This is an optional step for extra security.

Setting up BigQuery OAuth with Rasgo

To configure the OAuth integration, you need to create a Client ID and Secret in Google Cloud Console; Google's instructions cover how to do that.

When configuring the new OAuth Client ID, use these values:

  • Application type: Web application

  • Name: Rasgo

  • Authorized JavaScript Origins: https://app.rasgoml.com

  • Authorized redirect URIs: https://app.rasgoml.com/account/integration/bigquery

Click Create to create the Client ID and Secret, and save the values for each. Share them with Rasgo to complete your configuration.
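
For context, the snippet below is a sketch using google-auth-oauthlib, not anything Rasgo ships, showing how the Client ID, Secret, and redirect URI you just created fit into a standard Google OAuth flow; Rasgo performs the equivalent exchange on its side once you share the values.

    from google_auth_oauthlib.flow import Flow

    client_config = {
        "web": {
            "client_id": "<YOUR_CLIENT_ID>",
            "client_secret": "<YOUR_CLIENT_SECRET>",
            "auth_uri": "https://accounts.google.com/o/oauth2/auth",
            "token_uri": "https://oauth2.googleapis.com/token",
        }
    }

    flow = Flow.from_client_config(
        client_config,
        scopes=["https://www.googleapis.com/auth/bigquery"],
        redirect_uri="https://app.rasgoml.com/account/integration/bigquery",
    )

    # Each user is sent to this URL to consent; afterwards their queries run
    # under their own BigQuery credentials rather than the service account's.
    auth_url, _ = flow.authorization_url(access_type="offline", prompt="consent")
    print(auth_url)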

Success!

Configuration is complete! You're ready to start using Rasgo.

