Upload a table to Google BigQuery from Google Cloud Storage.

Parameters: see the dedicated page for more information.
BigQueryUpload loads a CSV file already stored in Google Cloud Storage (GCS) into a BigQuery table.
You pass a small two-column schema table on the input pin that names the destination columns and gives their types. The action then creates, overwrites, or appends the destination table and performs the load.
Prerequisites:

- A GCP project with BigQuery and Cloud Storage enabled.
- A GCS bucket that contains the CSV file.
- OAuth2 credentials: Client ID, Client Secret, and Refresh Token (use your platform’s “Unlock Google Services” action to obtain them).
- The service/user associated with the credentials must have:
  - `roles/storage.objectViewer` on the bucket (to read the CSV)
  - `roles/bigquery.dataEditor` and `roles/bigquery.jobUser` on the dataset

The action expects a schema table with two columns:
| Column name (default) | Meaning |
|---|---|
| Before | Destination column names in BigQuery (one row per column) |
| Type | Destination data type code |
Type codes → BigQuery types:

- K → INTEGER
- F → FLOAT
- U → STRING

Example schema rows:
Before,Type
customer_id,K
amount,F
comment,U
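The schema table above can be sketched in code. This is a minimal illustration (the function name and dict layout are assumptions, not the action's actual implementation): it maps the two-column `Before`/`Type` rows to BigQuery-style field definitions using the type codes listed above.

```python
# Map the action's type codes to BigQuery types (per the table above).
TYPE_CODES = {"K": "INTEGER", "F": "FLOAT", "U": "STRING"}

def schema_from_rows(rows):
    """rows: iterable of (column_name, type_code) pairs, i.e. the
    'Before'/'Type' schema table. Returns BigQuery-style field dicts."""
    schema = []
    for name, code in rows:
        if code not in TYPE_CODES:
            raise ValueError(f"unknown type code {code!r} for column {name!r}")
        schema.append({"name": name, "type": TYPE_CODES[code]})
    return schema

# The example schema rows above:
rows = [("customer_id", "K"), ("amount", "F"), ("comment", "U")]
print(schema_from_rows(rows))
```

An unknown type code raises immediately, mirroring how a bad schema row should fail the load before any data is sent.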
Name of the CSV file in Cloud Storage bucket — file name of the CSV object to load from the bucket.
Example: orders_2024_08.csv (bucket is set separately).
Name of the table in Big Query — Destination table name (no dataset). Example: TIMiTable.
Action to do if table already exists — what to do if the table exists:

- overwrite (drop & create, then load)
- append (keep table & insert rows)

First row contain column name — toggle; leave ON if the first row of the CSV contains column names (a header row).
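These two parameters correspond directly to settings in a BigQuery load-job configuration (`writeDisposition` and `skipLeadingRows` are the real REST field names). A minimal sketch, with a hypothetical helper name:

```python
# Translate the action's two parameters into BigQuery load-job fields.
def load_disposition(if_table_exists, first_row_is_header):
    dispositions = {
        "overwrite": "WRITE_TRUNCATE",  # drop & create, then load
        "append": "WRITE_APPEND",       # keep table & insert rows
    }
    return {
        "writeDisposition": dispositions[if_table_exists],
        "skipLeadingRows": 1 if first_row_is_header else 0,
    }

print(load_disposition("overwrite", True))
# {'writeDisposition': 'WRITE_TRUNCATE', 'skipLeadingRows': 1}
```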
Name of Google Project, Bucket name, Name of dataset — Examples: my-gcp-project, my-data-lake, sales_analytics.
clientID, clientSecret, refresh_token — OAuth2 credentials. Tip: store these securely in your platform’s secrets vault and reference them.
These options control how the helper converts values when deriving keys or floats from strings, and what to do on invalid values. They are useful when your CSV contains imperfect data.
- Convert float to key — abort | set to lower integer
- Convert string to float — abort | …
- Convert ‘String/?’ to key — abort | …
Additional JavaScript libraries — optional list if your platform supports custom parsing.
If column missing — issue warning and return index -1 (default) or abort.
Leave defaults unless you have specific cleansing rules.
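The abort-or-fallback behaviour of these options can be sketched as follows. These are hypothetical helpers for illustration, not the action's actual code; the "set to lower integer" fallback is interpreted as flooring, per the option label.

```python
import math

def string_to_float(value, on_error="abort", fallback=0.0):
    """Convert a string to a float: abort (raise) or return a fallback."""
    try:
        return float(value)
    except ValueError:
        if on_error == "abort":
            raise
        return fallback

def float_to_key(value, on_error="abort"):
    """Convert a float to an integer key: abort on a non-integer value,
    or 'set to lower integer' (floor)."""
    if value != int(value) and on_error == "abort":
        raise ValueError(f"{value!r} is not an integer key")
    return math.floor(value)

print(string_to_float("3.5"))                     # 3.5
print(float_to_key(3.9, "set to lower integer"))  # 3
```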
Upload CSV to GCS bucket my-data-lake as orders_2024_08.csv.
Build schema input (two columns):
Before,Type
order_id,K
order_amount,F
notes,U
Set parameters
- Name of the CSV file in Cloud Storage bucket: orders_2024_08.csv
- Name of the table in Big Query: orders
- Action to do if table already exists: overwrite (first run) / append (subsequent runs)
- First row contain column name: ON
- Name of Google Project: my-gcp-project
- Bucket name: my-data-lake
- Name of dataset: sales_analytics
- clientID, clientSecret, refresh_token
- Number of retries on connection error: 3 (default)

Run. Verify table sales_analytics.orders in BigQuery.
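Putting the worked example together: the sketch below assembles the REST body of a BigQuery load job from the parameters above. The field names (`sourceUris`, `destinationTable`, `writeDisposition`, `skipLeadingRows`, `schema`) are the real BigQuery `jobs.insert` load configuration; the builder function itself is an assumption for illustration, and no network call is made.

```python
# Assemble a BigQuery load-job REST body from the example's parameters.
def build_load_job(project, dataset, table, bucket, filename,
                   schema, overwrite=True, header=True):
    return {
        "configuration": {
            "load": {
                # Source CSV in Cloud Storage: gs://<bucket>/<filename>
                "sourceUris": [f"gs://{bucket}/{filename}"],
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
                "schema": {"fields": schema},
                "writeDisposition": "WRITE_TRUNCATE" if overwrite else "WRITE_APPEND",
                "skipLeadingRows": 1 if header else 0,
            }
        }
    }

job = build_load_job(
    "my-gcp-project", "sales_analytics", "orders",
    "my-data-lake", "orders_2024_08.csv",
    [{"name": "order_id", "type": "INTEGER"},
     {"name": "order_amount", "type": "FLOAT"},
     {"name": "notes", "type": "STRING"}],
)
print(job["configuration"]["load"]["sourceUris"])
# ['gs://my-data-lake/orders_2024_08.csv']
```

Submitting this body (with OAuth2 credentials) to the BigQuery jobs endpoint performs the same load the action does.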
- Use overwrite for deterministic rebuilds; use append for incremental ingestion.
- Store clientSecret and refresh_token as secrets; never hard-code them in flows shared with others.
- Double-check the bucket and Name of the CSV file in Cloud Storage bucket (names are case-sensitive).