BigQuery

How does Rasgo work with BigQuery?

Rasgo is a metadata-only product, meaning all of your actual rows and columns stay in your data warehouse; Rasgo interacts with your data by dynamically generating SQL.
Rasgo performs both reads and writes to BigQuery:
  • Rasgo catalogs tables and views in any project and dataset it has access to
  • Rasgo dynamically generates and executes SQL on behalf of the user to transform and analyze data
  • Rasgo can publish new tables and views into a single project.dataset location
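To make the bullets above concrete, here is a sketch of the kind of SQL Rasgo might generate and execute on a user's behalf, shown via the bq CLI. The project, dataset, and table names are hypothetical, and the actual SQL Rasgo produces will differ:

```shell
# Illustrative only -- project, dataset, and table names are made up.
# A user transform in Rasgo could compile to a statement like this,
# publishing a new view into the configured project.dataset location.
bq query --use_legacy_sql=false '
CREATE OR REPLACE VIEW `analytics.rasgo.orders_enriched` AS
SELECT o.*, c.segment
FROM `analytics.sales.orders` AS o
JOIN `analytics.sales.customers` AS c USING (customer_id)'
```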

Configure BigQuery for Rasgo

Step 1: Create the service account in Google Cloud Console

Follow these instructions in the BigQuery docs to create a service account and get a JSON service account key.
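If you prefer the command line, the same steps can be sketched with gcloud. The project ID and account name below are placeholders; substitute your own:

```shell
# Placeholder values -- replace with your own project and account name.
PROJECT_ID="my-gcp-project"
SA_NAME="rasgo-sa"

# Create the service account
gcloud iam service-accounts create "$SA_NAME" \
  --project="$PROJECT_ID" \
  --display-name="Rasgo service account"

# Generate a JSON key for it (this file is what Rasgo will need)
gcloud iam service-accounts keys create rasgo-key.json \
  --iam-account="$SA_NAME@$PROJECT_ID.iam.gserviceaccount.com"
```

Keep the resulting `rasgo-key.json` secure; you will supply it to Rasgo in Step 3.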

Step 2: Grant IAM permissions to the new service account

Rasgo needs the following IAM permissions to run:
  • Editor permissions to read and write a single project.dataset (e.g. analytics.rasgo)
  • User permissions to run queries in the same project as above
  • Viewer permissions to read from all projects and datasets that should be cataloged by Rasgo
  • Storage permissions to create CSVs for download, view downloaded CSVs, and view the buckets used for download
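Assuming the common mapping of these permissions to predefined BigQuery roles (`roles/bigquery.jobUser` for running queries, `roles/bigquery.dataViewer` for cataloging — confirm the exact roles for your environment), the project-level grants can be sketched with gcloud. The dataset-level Editor grant on the single write dataset is easiest to add from the dataset's Sharing dialog in the BigQuery console:

```shell
# Placeholder values -- substitute your project and service account.
PROJECT_ID="analytics"
SA="rasgo-sa@${PROJECT_ID}.iam.gserviceaccount.com"

# "User" permissions: run query jobs in the project
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:$SA" \
  --role="roles/bigquery.jobUser"

# "Viewer" permissions: read tables and views so Rasgo can catalog them
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:$SA" \
  --role="roles/bigquery.dataViewer"
```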

Step 3: Connect Rasgo to Your Data

In the Rasgo UI, enter the BigQuery project and dataset that Rasgo should write to, as well as the service account key credentials.
After this step, you're ready to start using Rasgo! Consider the remaining steps to enhance your users' experience:

Step 4: Create a bucket for CSV exports

While using Rasgo, users can select a GCS bucket and export CSVs to it. This is set on the User profile screen.
To enable this functionality, your Google Cloud admin should ensure that users have access to view and write to the preferred bucket. To create one, follow Google's create a bucket instructions in the Cloud Storage console.
Rasgo strongly recommends setting up a lifecycle rule for the CSV files that are downloaded to this bucket.
After creating the bucket, click on the Lifecycle tab on the bucket's details page. Add a new Delete rule that removes files from the bucket after an age of 1 day (or your preferred retention period).
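Bucket creation and the recommended lifecycle rule can also be done with gsutil; the bucket name below is a placeholder:

```shell
# Placeholder bucket name -- pick a globally unique one.
BUCKET="gs://my-rasgo-csv-exports"

# Create the bucket
gsutil mb "$BUCKET"

# Lifecycle rule: delete objects one day after creation
cat > lifecycle.json <<'EOF'
{
  "rule": [
    { "action": { "type": "Delete" }, "condition": { "age": 1 } }
  ]
}
EOF
gsutil lifecycle set lifecycle.json "$BUCKET"
```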

Step 5 (Optional): Enable OAuth for your Users

Rasgo can execute queries on BigQuery with individual user credentials via its BigQuery OAuth integration. This optional step adds an extra layer of security.
Setting up BigQuery OAuth with Rasgo
To configure the OAuth integration, you need to create a Client ID and Secret in Google Cloud Console. Here are Google's instructions on how to do that.
When working on the configuration for the new OAuth Client ID, use these values:
  • Application type: Web application
  • Name: Rasgo
  • Authorized JavaScript Origins: https://app.rasgoml.com
  • Authorized redirect URIs: https://app.rasgoml.com/account/integration/bigquery
Click Create to create the Client ID and Secret, and save the values for each. Share them with Rasgo to complete your configuration.

Success!

Configuration is complete! You're ready to start using Rasgo.