Rasgo is a metadata-only product: all of your actual rows and columns stay in your data warehouse, and Rasgo interacts with your data by dynamically generating SQL.
Rasgo performs both reads and writes to BigQuery:
- Rasgo catalogs tables and views in any project and dataset it has access to
- Rasgo dynamically generates and executes SQL on behalf of the user to transform and analyze data
- Rasgo can publish new tables and views into a single project and dataset that you designate, as sketched below
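The sketch below is a rough illustration of this metadata-only pattern, not Rasgo's actual code: the tool reads schema metadata, generates SQL, and executes it inside BigQuery so the rows never leave the warehouse. It assumes the google-cloud-bigquery client library and hypothetical project, dataset, and table names.

```python
# Illustrative sketch only -- not Rasgo's internal implementation.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project

# Catalog: read table metadata (schema only; no row data is pulled out).
table = client.get_table("my-gcp-project.analytics.orders")  # hypothetical table
columns = [field.name for field in table.schema]

# Transform: dynamically generate SQL and execute it inside BigQuery,
# publishing the result as a new view in a designated output dataset.
sql = """
CREATE OR REPLACE VIEW `my-gcp-project.rasgo_outputs.orders_by_day` AS
SELECT DATE(order_ts) AS order_date, COUNT(*) AS order_count
FROM `my-gcp-project.analytics.orders`
GROUP BY order_date
"""
client.query(sql).result()  # all computation happens in the warehouse
```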
Rasgo needs the following IAM permissions to run:
In the Google Cloud Storage console, create a bucket with default settings that you want Rasgo to use when exporting CSVs, and share the bucket name with Rasgo. Once configured, your users can trigger a CSV download through the Rasgo UI; Rasgo will export the query results to this bucket and generate a download URL.
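For orientation, here is a minimal sketch of what that export flow looks like with the google-cloud-storage client. Rasgo performs these steps for you once the bucket is shared; the bucket and object names below are hypothetical.

```python
# Sketch of the CSV export flow, for orientation only.
import datetime
from google.cloud import storage

client = storage.Client(project="my-gcp-project")  # hypothetical project

# Create a bucket with default settings (or do this in the console as above).
bucket = client.create_bucket("rasgo-csv-exports")  # hypothetical bucket name

# Write query results as a CSV object, then generate a time-limited download URL.
blob = bucket.blob("exports/orders_by_day.csv")
blob.upload_from_string("order_date,order_count\n2024-01-01,42\n", content_type="text/csv")

# Signing requires service-account credentials that include a private key.
url = blob.generate_signed_url(expiration=datetime.timedelta(hours=1))
print(url)
```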
After creating the bucket, go to the bucket's page and click on
In the Rasgo UI, enter the BigQuery project and dataset that Rasgo should write to, as well as the service account key credentials.
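If you want to confirm the key works before entering it, a minimal local check is sketched below, assuming the google-cloud-bigquery library and hypothetical key-file and dataset names.

```python
# Optional sanity check: confirm the service account key can reach the
# project and dataset that Rasgo will write to. Names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("rasgo-sa-key.json")
dataset = client.get_dataset("my-gcp-project.rasgo_outputs")
print(f"OK: can read dataset {dataset.full_dataset_id}")
```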
Configuration is complete! You're ready to start using Rasgo.
Rasgo supports individual user credentials for executing queries on BigQuery via its BigQuery OAuth integration. This step is optional and adds an extra layer of security.
When configuring the new OAuth Client ID, use these values:
- Application type: Web application
- Name: Rasgo
- Authorized redirect URIs: https://app.rasgoml.com/account/integration/bigquery
Click Create to create the Client ID and Secret, and save the values for each. Share them with Rasgo to complete your configuration.
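For context, the sketch below shows how the Client ID, Client Secret, and redirect URI fit together in a standard Google OAuth flow, using the google-auth-oauthlib library. You do not need to run this yourself; Rasgo handles the flow on your users' behalf, and the values shown are placeholders.

```python
# Illustration of the OAuth handshake Rasgo runs with the values you share.
from google_auth_oauthlib.flow import Flow

flow = Flow.from_client_config(
    {
        "web": {
            "client_id": "YOUR_CLIENT_ID",          # placeholder
            "client_secret": "YOUR_CLIENT_SECRET",  # placeholder
            "auth_uri": "https://accounts.google.com/o/oauth2/auth",
            "token_uri": "https://oauth2.googleapis.com/token",
        }
    },
    scopes=["https://www.googleapis.com/auth/bigquery"],
    redirect_uri="https://app.rasgoml.com/account/integration/bigquery",
)

# Each user is sent to this URL to grant consent; the code returned to the
# redirect URI is exchanged for tokens used to run queries as that user.
auth_url, _ = flow.authorization_url(access_type="offline", prompt="consent")
print(auth_url)
```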
Next up is importing tables: