BigQuery

Read from and write to BigQuery.

Getting Started

To create a BigQuery-based workflow, you will need:

  1. A source BigQuery connection.

  2. (optional) A list of tables OR SQL queries.

  3. (optional) A destination BigQuery connection.

Do not use your input data warehouse connection as an output connector. This action can result in the unintended overwriting of existing data.

Create a BigQuery Connection

Google BigQuery-related actions require creating a bigquery connection. The connection must be configured with the correct permissions for each Gretel Workflow Action.

For specific permissions, please refer to the Minimum Permissions section under each corresponding action.

Gretel bigquery connections require the following fields:

Connection Creation Parameters

| Parameter | Description | Example |
| --- | --- | --- |
| name | Display name of your choosing, used to identify your connection within Gretel. | my-bigquery-connection |
| connection_target_type | source or destination, depending on whether you want to read data from or write data to the connection. | source |
| project_id | ID of the Google Cloud project containing your dataset. | my-project-id |
| service_account_email | The service account email associated with your private key. | service-account-name@my-project-id.iam.gserviceaccount.com |
| dataset | Name of the dataset to connect to. | my-dataset-name |
| private_key_json | Private key JSON blob used to authenticate Gretel. | { "type": "service_account", "project_id": "my-project-id", "private_key": "-----BEGIN PRIVATE KEY----- ..." } |

Create a Service Account

To generate a private key, you will first need to create a service account, then download a key for that service account. See Google Cloud's "Use service accounts" and "Introduction to IAM" documentation for step-by-step instructions.
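The two steps above can be sketched with the gcloud CLI. This assumes gcloud is installed and authenticated against your project; the service account name and key-file name are placeholders:

```shell
# Create a service account for Gretel (name is a placeholder)
gcloud iam service-accounts create gretel-connector \
    --project=my-project-id \
    --display-name="Gretel BigQuery connector"

# Download a JSON key for the service account; the contents of this
# file become the private_key_json credential in your Gretel connection
gcloud iam service-accounts keys create gretel-key.json \
    --iam-account=gretel-connector@my-project-id.iam.gserviceaccount.com
```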

Configure Dataset IAM Permissions

After the service account has been created, you can attach dataset-specific permissions to the service account.

Please see each action's Minimum Permissions section for a list of permissions to attach to the service account.
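As an illustration, a source connection typically needs read access to the dataset plus the ability to run query jobs. The sketch below uses project-level role bindings; the specific roles shown are assumptions, so treat each action's Minimum Permissions section as authoritative:

```shell
# Allow the service account to read BigQuery data (project-wide here;
# access can also be scoped to a single dataset in the Cloud Console)
gcloud projects add-iam-policy-binding my-project-id \
    --member="serviceAccount:gretel-connector@my-project-id.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"

# Allow the service account to run BigQuery jobs (queries)
gcloud projects add-iam-policy-binding my-project-id \
    --member="serviceAccount:gretel-connector@my-project-id.iam.gserviceaccount.com" \
    --role="roles/bigquery.jobUser"
```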

Creating Connections

First, create a file on your local computer containing the connection credentials. This file should include the type, name, connection_target_type, config, and credentials fields. The config and credentials fields contain settings specific to the connection being created.

Below is an example BigQuery connection credential file:

{
    "type": "bigquery",
    "name": "my-bigquery-connection",
    "connection_target_type": "source",
    "config": {
        "project_id": "my-project-id",
        "service_account_email": "service-account-name@my-project-id.iam.gserviceaccount.com",
        "dataset": "my-dataset"
    },
    "credentials": {
        "private_key_json": "..."
    }
}
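Before creating the connection, it can help to sanity-check the credential file locally. Below is a minimal sketch using only the Python standard library; check_connection_file is a hypothetical helper (not part of the Gretel SDK), and the required field names follow the example file above:

```python
import json

# Fields the example connection file above contains
REQUIRED_TOP_LEVEL = {"type", "name", "config", "credentials"}
REQUIRED_CONFIG = {"project_id", "service_account_email", "dataset"}


def check_connection_file(path):
    """Validate that a BigQuery connection file has the expected shape."""
    with open(path) as f:
        doc = json.load(f)

    missing = REQUIRED_TOP_LEVEL - doc.keys()
    if missing:
        raise ValueError(f"missing top-level fields: {sorted(missing)}")
    if doc["type"] != "bigquery":
        raise ValueError("type must be 'bigquery'")

    missing_cfg = REQUIRED_CONFIG - doc["config"].keys()
    if missing_cfg:
        raise ValueError(f"missing config fields: {sorted(missing_cfg)}")
    if "private_key_json" not in doc["credentials"]:
        raise ValueError("credentials must include private_key_json")
    return doc
```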

Now that you've created the credentials file, use the CLI to create the connection:

gretel connections create --project [project id] --from-file [credential_file.json]
Alternatively, create the connection from the Gretel Console:

  • Navigate to the Connections page using the menu item in the left sidebar.

  • Click the New Connection button.

  • Step 1, choose the Type for the Connection - BigQuery.

  • Step 2, choose the Project for your Connection.

  • Step 3, fill in the credentials and select Add Connection.

You can also create the connection with the Python SDK:

from gretel_client import create_or_get_unique_project
from gretel_client.config import get_session_config
from gretel_client.rest_v1.api.connections_api import ConnectionsApi
from gretel_client.rest_v1.models import (
    CreateConnectionRequest,
    UpdateConnectionRequest,
)

session = get_session_config()
connection_api = session.get_v1_api(ConnectionsApi)

project = create_or_get_unique_project(name="bigquery-workflow")

connection = connection_api.create_connection(
    CreateConnectionRequest(
        name="my-bigquery-connection",
        project_id=project.project_guid,
        type="bigquery",
        connection_target_type="source",    # other option: "destination"
        config={
            "dataset": "my-dataset",
            "project_id": "my-project-id",
            "service_account_email": "user@my-project-id.iam.gserviceaccount.com"
        },
        # note: best practice is to read in credentials from a file
        # or secret instead of directly embedding sensitive values
        # in python code.
        credentials={
            "private_key_json": "...",
        },
    )
)