# BigQuery

Google BigQuery is a fully managed, serverless data warehouse that lets you run fast SQL queries on large datasets. It's ideal for analytics, reporting, and data pipelines — and it scales automatically as your data grows.

Using BigQuery as a Coupler.io destination lets you load structured data from any supported source directly into your data warehouse, without writing any ETL code.

## Why use BigQuery as a destination?

* **Centralize your data** — pull from marketing tools, CRMs, spreadsheets, and more into a single place for SQL-based analysis
* **Automate pipelines** — schedule data loads on any cadence without manual exports or scripts
* **Works with any Coupler.io source** — every supported integration can be routed to BigQuery with the same setup steps
* **Flexible schema control** — let BigQuery auto-detect column types, or define your own schema manually for full control
* **BI-ready** — data lands in BigQuery ready for Looker Studio, Power BI, or any SQL-compatible analytics tool

## Prerequisites

Before you start, make sure you have:

* A **Google Cloud Platform account** with a BigQuery project and dataset already created
* A **Service Account** with the correct IAM roles (see [Permission errors](https://docs.coupler.io/destinations/categories/database/big_query/common-issues) if you're unsure)
* A **JSON key file** downloaded from your GCP Service Account — this is how Coupler.io authenticates with BigQuery
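
For reference, a GCP Service Account key file is a JSON document shaped roughly like this (all values below are placeholders, not real credentials, and the real file contains a few additional fields):

```json
{
  "type": "service_account",
  "project_id": "my-gcp-project",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "coupler-loader@my-gcp-project.iam.gserviceaccount.com",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

If your downloaded file looks like this and `type` is `service_account`, it's the right kind of key.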

## Quick start

{% hint style="success" %}
Have your GCP JSON key file ready before you begin. You'll upload it during the connection step. If you haven't created one yet, follow Coupler.io's guide on generating a Google Cloud JSON key file first.
{% endhint %}

## How to connect

{% stepper %}
{% step %}
**Add a source to your data flow.** In Coupler.io, create a new data flow and configure at least one source. This can be any supported integration — Airtable, Facebook Ads, Clockify, and so on. You must have a source connected before you can save and run the data flow.
{% endstep %}

{% step %}
**Select BigQuery as your destination.** In the Destination step of your data flow, choose **BigQuery** from the list of available destinations.
{% endstep %}

{% step %}
**Upload your JSON key file.** Click **Select file** and upload the JSON key file you downloaded from your GCP Service Account. This file authenticates Coupler.io with your BigQuery project. Once uploaded, click **Save** to establish the connection.
{% endstep %}

{% step %}
**Set your dataset and table name.** Enter the **Dataset name** exactly as it appears in BigQuery (for example, `my_analytics_dataset`). Then enter a **Table name** — you can use an existing table or type a new name, and Coupler.io will create it automatically on the first run.
{% endstep %}

{% step %}
**Configure the schema (optional but recommended).** By default, BigQuery auto-detects column types from your data. If you need precise type control — or if your source sometimes returns empty datasets — disable the **Autodetect table schema** toggle and enter your schema as JSON. See the [schema definition guide](https://github.com/coupler-io/knowledge-base/blob/main/destinations/how-to-generate-bigquery-schema.md) for the format.
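
As an illustration, a manual schema in BigQuery's standard JSON format is an array of field objects with `name`, `type`, and an optional `mode`. The field names below are invented for the example — check the linked schema guide for the exact format Coupler.io expects:

```json
[
  { "name": "date",     "type": "DATE",    "mode": "REQUIRED" },
  { "name": "campaign", "type": "STRING",  "mode": "NULLABLE" },
  { "name": "clicks",   "type": "INTEGER", "mode": "NULLABLE" },
  { "name": "spend",    "type": "FLOAT",   "mode": "NULLABLE" }
]
```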
{% endstep %}

{% step %}
**Choose a write mode.** Select **Replace** to overwrite the table with fresh data on every run, or **Append** to add new rows below existing data. Replace is best for snapshots; Append is best for building historical logs.
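
The difference between the two modes can be sketched in plain Python (a hypothetical in-memory stand-in for the destination table, not Coupler.io's actual implementation):

```python
def load(existing_rows, new_rows, mode):
    """Simulate one data-flow run against an in-memory 'table'."""
    if mode == "replace":
        # Replace: the table ends up containing only the fresh snapshot.
        return list(new_rows)
    if mode == "append":
        # Append: new rows land below whatever is already there.
        return existing_rows + list(new_rows)
    raise ValueError(f"unknown write mode: {mode!r}")

table = [{"day": "2024-01-01", "clicks": 10}]
fresh = [{"day": "2024-01-02", "clicks": 12}]

print(load(table, fresh, "replace"))  # snapshot: only the fresh rows
print(load(table, fresh, "append"))   # history: old rows plus fresh rows
```

With Replace, repeated runs keep the table at the size of the latest extract; with Append, the table grows on every run, so include a date or run-timestamp column if you plan to build historical logs.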
{% endstep %}

{% step %}
**Run the data flow.** Click **Save and Run**. Coupler.io will load your data into BigQuery. Once the run completes successfully, open BigQuery to verify your table and data.
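
A quick way to verify is to run a count query in the BigQuery console (substitute your own project, dataset, and table names — the ones below are placeholders):

```sql
SELECT COUNT(*) AS row_count
FROM `my-gcp-project.my_analytics_dataset.my_table`;
```

If the row count matches what you expect from your source, the flow is working.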
{% endstep %}
{% endstepper %}

## Supported features

| Feature                          | Supported                   |
| -------------------------------- | --------------------------- |
| Replace mode                     | Yes                         |
| Append mode                      | Yes                         |
| Automatic scheduling             | Yes                         |
| Type enforcement (manual schema) | Yes                         |
| OAuth sign-in                    | No (JSON key file required) |
| Templates                        | No                          |
