Google BigQuery Setup

Step-by-step guide to connecting Google BigQuery as a data source in Klairr.

This guide walks you through connecting Google BigQuery to Klairr. You’ll need a Google Cloud project with BigQuery data and admin access to create a service account.

Prerequisites

  • A Google Cloud project with BigQuery enabled
  • At least one dataset with tables you want to query
  • Permission to create service accounts and manage IAM roles

Step 1: Create a service account

  1. Go to the Google Cloud Console
  2. Select your project
  3. Navigate to IAM & Admin > Service Accounts
  4. Click Create Service Account
  5. Enter a name (e.g., “klairr-reader”) and description
  6. Click Create and Continue
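If you prefer the command line, the same service account can be created with the gcloud CLI. A sketch, assuming the CLI is installed and authenticated, with PROJECT_ID as a placeholder for your project ID:

```shell
# Create a read-only service account for Klairr.
# Replace PROJECT_ID with your Google Cloud project ID.
gcloud iam service-accounts create klairr-reader \
  --project=PROJECT_ID \
  --display-name="klairr-reader" \
  --description="Read-only access for Klairr"
```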

Step 2: Grant permissions

Assign the following roles to the service account:

  • BigQuery Data Viewer — Read access to tables and views
  • BigQuery Job User — Permission to run queries

These are the minimum required roles. Do not grant broader access — Klairr only needs read permissions.
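The same two bindings can be granted from the command line. A sketch, assuming the gcloud CLI and a PROJECT_ID placeholder:

```shell
# Grant the two minimum roles at the project level.
# Replace PROJECT_ID with your Google Cloud project ID.
SA="klairr-reader@PROJECT_ID.iam.gserviceaccount.com"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:${SA}" \
  --role="roles/bigquery.dataViewer"

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:${SA}" \
  --role="roles/bigquery.jobUser"
```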

Step 3: Create a JSON key

  1. Click on your new service account
  2. Go to the Keys tab
  3. Click Add Key > Create new key
  4. Select JSON format
  5. Download the key file

Keep this file secure. You’ll paste its contents into Klairr in the next step.
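The key can also be generated from the command line. A sketch, assuming the service account created above and a PROJECT_ID placeholder:

```shell
# Generate a JSON key and save it locally. Treat this file as a secret.
gcloud iam service-accounts keys create klairr-key.json \
  --iam-account="klairr-reader@PROJECT_ID.iam.gserviceaccount.com"
```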

Step 4: Add the data source in Klairr

  1. In Klairr, go to Data sources
  2. Click Add data source
  3. Select Google BigQuery
  4. Paste the entire JSON key file contents
  5. Select the project and dataset you want to query
  6. Click Connect

Step 5: Schema introspection

After connecting, Klairr automatically introspects your schema:

  • Tables — all tables in the selected dataset
  • Columns — column names, types, and descriptions (if set in BigQuery)
  • Enum values — sample values from low-cardinality columns (used for intelligent filtering)

This typically takes a few seconds. Once complete, you can start asking questions.
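Klairr handles introspection automatically, but you can preview the metadata it will see by querying BigQuery's INFORMATION_SCHEMA views yourself. A sketch using the bq CLI, with my_dataset as a placeholder for your dataset:

```shell
# List every table's columns and types in a dataset,
# roughly the metadata Klairr pulls during introspection.
bq query --use_legacy_sql=false \
  'SELECT table_name, column_name, data_type
   FROM my_dataset.INFORMATION_SCHEMA.COLUMNS
   ORDER BY table_name, ordinal_position'
```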

Cost estimation

Klairr shows the estimated BigQuery cost before executing each query:

  • Bytes to be processed — shown before execution
  • Estimated cost — calculated using BigQuery’s on-demand pricing ($5/TB)
  • Dry-run mode — in SQL Live Edit, you can check cost before running a modified query
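You can reproduce this estimate yourself: a bq dry run reports the bytes a query would scan without running it, and the cost follows from the $5/TB rate. A sketch, using a hypothetical byte count in place of a real dry-run result:

```shell
# Get the bytes a query would scan without executing it:
#   bq query --dry_run --use_legacy_sql=false 'SELECT ...'
# Then convert bytes to an on-demand cost at $5/TB:
bytes=2147483648   # hypothetical dry-run result (~2 GiB)
cost=$(awk -v b="$bytes" 'BEGIN { printf "%.4f", b / 1e12 * 5 }')
echo "Estimated cost: \$${cost}"   # prints: Estimated cost: $0.0107
```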

Spend limits

Admins can set spend limits to prevent runaway costs:

  • Per-query limit — maximum bytes scanned per query
  • Daily limit — maximum total bytes scanned per day per user
  • Automatic LIMIT — Klairr injects a LIMIT clause to prevent full-table scans

Configure these in Settings > Data sources for each data source.
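Klairr's limits are configured in its UI, but BigQuery also exposes the underlying guardrail directly if you want to try it: a query whose scan would exceed maximum_bytes_billed fails before any bytes are billed. A sketch with a placeholder table:

```shell
# Fail the query up front if it would scan more than ~1 GB.
bq query --use_legacy_sql=false \
  --maximum_bytes_billed=1000000000 \
  'SELECT * FROM my_dataset.my_table'
```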

Troubleshooting

“Permission denied” error

  • Verify the service account has BigQuery Data Viewer and BigQuery Job User roles
  • Confirm the roles are assigned at the project level (not just the dataset level)
  • Check that the correct project is selected in Klairr
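To see which project-level roles the service account actually holds, a gcloud policy check like this can help (PROJECT_ID is a placeholder):

```shell
# List every project-level role bound to the Klairr service account.
gcloud projects get-iam-policy PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:klairr-reader@PROJECT_ID.iam.gserviceaccount.com" \
  --format="table(bindings.role)"
```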

“Dataset not found”

  • Ensure the service account has access to the specific dataset
  • Verify the dataset exists in the selected project
  • Check for region restrictions on the dataset

Schema not loading

  • The service account may lack permission to list tables — verify the Data Viewer role
  • Try disconnecting and reconnecting the data source
  • Large datasets (100+ tables) may take longer to introspect

Queries timing out

  • Klairr injects a LIMIT clause by default, but complex queries on very large tables may still be slow
  • Consider creating views or materialized views for commonly queried data
  • Check if the dataset uses partitioning and ensure Klairr’s queries can leverage it
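To check whether a table is partitioned (and on which column), inspecting its metadata with bq show is a quick way. A sketch with a placeholder table:

```shell
# Inspect a table's partitioning configuration.
# Queries that filter on the partitioning column scan far less data.
bq show --format=prettyjson my_dataset.my_table | grep -A 3 timePartitioning
```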
