The Valohai Ecosystem database connectors allow you to easily run database queries on AWS Redshift, GCP BigQuery and Snowflake. You can provide the information required for authenticating the connection as environment variables in Valohai. For Redshift and BigQuery it is also possible to use machine identity (IAM roles) to authenticate the database connection.
You can define the query in the Valohai UI, and it will be saved with the other details of the execution. Moreover, the query results will be saved as execution outputs and automatically uploaded to your data store, so you can use them in other jobs you run in Valohai. Like any other execution output, the result file will get a datum URL, which allows you to connect a datum alias to it.
Below, you can find instructions on how to connect to your database on GCP BigQuery.
Prerequisites:
- BigQuery workspace on your GCP account with imported data.
- GCP Service Account with BigQuery Data Viewer and BigQuery User roles.
- A way to authenticate with the database, either:
  - a keyfile for the service account (see the local sanity-check sketch below), or
  - the service account connected to your Valohai environments.
If you want to connect the service account to your Valohai environments, share its email with your Valohai contact.
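Before wiring the keyfile into Valohai, it can be worth checking locally that the service account can actually reach BigQuery. Below is a minimal sketch of such a check, assuming the keyfile has been downloaded as keyfile.json (a hypothetical local path) and the google-cloud-bigquery package is installed:

```python
# Sanity check: can this service account see datasets in the project?
# "keyfile.json" is a hypothetical local path to the downloaded keyfile.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file("keyfile.json")
client = bigquery.Client(project=credentials.project_id, credentials=credentials)

for dataset in client.list_datasets():
    print(dataset.dataset_id)
```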
Add environment variables
You will need to define the following environment variables for the execution:
- GCP_PROJECT, the ID of your GCP project.
- GCP_IAM, 1 by default; set to 0 to log in with the keyfile instead of the service account attached to the worker.
If you log in with the service account keyfile (i.e. GCP_IAM is set to 0), also define:
- GCP_KEYFILE_CONTENTS_JSON, the contents of the service account keyfile.
- Remember to save this one as a secret.
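To make the behavior of these variables concrete, here is a minimal Python sketch of the logic a BigQuery query step could apply: rely on the worker's attached service account by default, and build credentials from GCP_KEYFILE_CONTENTS_JSON only when GCP_IAM is 0. Only the environment variable names come from this article; the query, the output path, and the use of google-cloud-bigquery are illustrative assumptions:

```python
# Illustrative sketch of the GCP_* variables driving BigQuery authentication.
import json
import os

from google.cloud import bigquery
from google.oauth2 import service_account

project = os.environ["GCP_PROJECT"]

if os.environ.get("GCP_IAM", "1") != "0":
    # GCP_IAM=1 (default): use the service account attached to the worker.
    client = bigquery.Client(project=project)
else:
    # GCP_IAM=0: build credentials from the keyfile stored as a secret.
    keyfile = json.loads(os.environ["GCP_KEYFILE_CONTENTS_JSON"])
    credentials = service_account.Credentials.from_service_account_info(keyfile)
    client = bigquery.Client(project=project, credentials=credentials)

# Run a placeholder query and write the result where Valohai collects outputs.
rows = client.query("SELECT 1 AS example").result().to_dataframe()
rows.to_csv("/valohai/outputs/results.csv", index=False)
```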
Create a Valohai execution with the GCP BigQuery connector
To create an execution with the BigQuery connector, follow the instructions below.
- Open your project at app.valohai.com.
- Click on the Create Execution button under the Executions tab.
- Expand the step library by clicking on the plus sign next to valohai-ecosystem in the left side menu.
- Choose the bigquery-query step under Connectors.
- (Optional) Change the settings, such as the environment or Docker image, under the runtime section if needed.
- Write your SQL query into the field under the Parameters section.
- By default, the query results will be saved as results.csv, but you can also define another path for the output.
- (Optional) You can give the output file a datum alias, e.g. bigquery-query, to easily refer to it in your executions with datum://bigquery-query (see the sketch after these steps).
- If you did not save the environment variables under the project Settings or as an environment variable group on the organization level, add them under the Environment Variables section.
- You can edit and/or add new environment variables if needed.
- Click on Create Execution.
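Once the execution finishes, a later execution can consume the result file through the datum alias. Below is a hedged sketch of a downstream step, assuming a valohai.yaml input named query-results that points at datum://bigquery-query (the input name and the pandas processing are illustrative; /valohai/inputs/<name>/ is where Valohai mounts execution inputs):

```python
# Hypothetical downstream step that reads the connector's results.csv input.
from pathlib import Path

import pandas as pd

# Valohai mounts the input named "query-results" under this directory.
input_dir = Path("/valohai/inputs/query-results")
results = pd.read_csv(next(input_dir.glob("*.csv")))
print(results.head())
```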