Google BigQuery Integration

Everything you need to know to bring your existing data from Google's BigQuery servers to Locale.

Written by Aditi

Google BigQuery is a fully managed, serverless data warehouse that supports Structured Query Language (SQL), built on top of the Google Cloud Platform.

Prerequisites

  1. A user with BigQuery Admin and Storage Admin roles for the Google BigQuery project

  2. An active Billing account associated with your BigQuery project

Steps

  • Step 1: Create a Service Account

  • Step 2: Get Credentials for your Service Account

  • Step 3: Get your Dataset ID and Location

  • Step 4: Set up the Connection on Locale

Step 1: Create a Service account

  1. Log in to the Google Cloud Platform with an Admin or higher role and open the API Credentials Page. If prompted, select or create a project.

  2. Click the “Create credentials” button. On the dropdown that appears, choose “Service account”.

  3. Provide the Service account name, Service account ID (auto-populated), and a description, then click Create and Continue.

  4. On the following page, grant the BigQuery Admin role to the service account.

  5. Alternatively, you can create a custom role and attach it to the service account. The role must include the following permissions (an optional scripted example follows this list):

  • bigquery.jobs.create

  • bigquery.jobs.get

  • bigquery.jobs.update

  • bigquery.datasets.get

  • bigquery.tables.list

  • bigquery.tables.get

  • bigquery.tables.getData
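
If you prefer to script the custom role instead of clicking through the console, the following is a minimal sketch using the IAM v1 API via the google-api-python-client library. It assumes Application Default Credentials for an account that can administer IAM roles; the project ID, role ID, and title below are placeholders you should replace with your own values.

```python
# Minimal sketch: create a custom role with the permissions listed above,
# using the IAM v1 API via google-api-python-client
# (pip install google-api-python-client).
# Assumes Application Default Credentials for an account allowed to manage IAM roles.
import googleapiclient.discovery

PROJECT_ID = "your-project-id"  # placeholder: your GCP project ID

iam = googleapiclient.discovery.build("iam", "v1")

role = iam.projects().roles().create(
    parent=f"projects/{PROJECT_ID}",
    body={
        "roleId": "localeBigQueryAccess",  # placeholder role ID
        "role": {
            "title": "Locale BigQuery Access",
            "description": "Permissions required by the Locale BigQuery integration",
            "includedPermissions": [
                "bigquery.jobs.create",
                "bigquery.jobs.get",
                "bigquery.jobs.update",
                "bigquery.datasets.get",
                "bigquery.tables.list",
                "bigquery.tables.get",
                "bigquery.tables.getData",
            ],
            "stage": "GA",
        },
    },
).execute()

print("Created role:", role["name"])
```

You can then attach this custom role to the service account from the IAM page, just as you would the BigQuery Admin role.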

Step 2: Get Credentials for your service account

  1. Open the [Service Accounts](<https://console.cloud.google.com/iam-admin/serviceaccounts>) page and select the service account you created.

  2. Head to the Keys section and click on Add Key. Under key type, select JSON and hit “Create”.

  3. A .json file will then download to your computer. Use this when setting up your Data Source.
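
Before uploading the key to Locale, you can optionally sanity-check it with the google-cloud-bigquery Python client; the file name below is just an example for wherever you saved the key.

```python
# Minimal sketch: confirm the downloaded service account key can run BigQuery jobs.
# Requires google-cloud-bigquery (pip install google-cloud-bigquery);
# "locale-service-account.json" is an example path to the key you just downloaded.
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("locale-service-account.json")

# A trivial query exercises the bigquery.jobs.create / bigquery.jobs.get permissions.
rows = client.query("SELECT 1 AS ok").result()
for row in rows:
    print("Query succeeded, result:", row.ok)
```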

Step 3: Get your Dataset ID and Location

  1. Go to your BigQuery instance.

  2. Select the Project ID.

  3. From the list of datasets in the project, choose the one whose dataset ID and location you want.

  4. Copy the Dataset ID and Data Location. The dataset ID is displayed as project-name:dataset-ID; copy only the dataset ID. For example, for project-name:test-dataset, the dataset ID is test-dataset.
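
If you prefer to look these values up programmatically, here is a small sketch with the same Python client (reusing the example key file from Step 2) that prints every dataset ID and its location:

```python
# Minimal sketch: print the dataset ID and location of each dataset in the project.
# Reuses the example key file from Step 2; only datasets the service account
# can access (bigquery.datasets.get) will be listed.
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("locale-service-account.json")

for item in client.list_datasets():
    dataset = client.get_dataset(item.dataset_id)  # fetch full metadata, including location
    print(f"Dataset ID: {dataset.dataset_id}  Location: {dataset.location}")
```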

Step 4: Setting up the Connection

  1. Click on Settings → Data Connections → New Data Connection

  2. Select BigQuery under Data Connection to create a new data source connection, then fill in the following fields:

  • Name of the Datasource: A unique name to identify the data source. You may have multiple databases or clusters connected to Locale, and this name lets you tell them apart within Locale’s platform.

  • Project ID: The globally unique identifier for the project where all your datasets and data tables are stored.

    To learn more about creating and managing your projects in the Google Cloud Platform, follow this documentation: https://cloud.google.com/resource-manager/docs/creating-managing-projects

  • Service Account Key: Upload your service account key (the JSON file downloaded earlier) to give Locale the permissions it needs to run queries on your databases, trigger alerts, and create incidents on top of them. See Step 1: Create a Service account for more details.

Note: Locale will cancel queries that run for more than 90 seconds. This protects your database from rogue queries and prevents a backlog of alerts in the Locale system. If you have a use case where this limit needs to be increased, please get in touch with our Support.
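
If you want to check in advance whether an alert query fits inside that window, one option is to time it locally with the same Python client by passing a 90-second timeout to result(); the key file path and the query below are placeholders for your own values.

```python
# Minimal sketch: check that an alert query completes within Locale's 90-second limit.
# The key file path and SQL are placeholders for your own values.
import concurrent.futures

from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("locale-service-account.json")
job = client.query("SELECT COUNT(*) AS n FROM `your-project.test-dataset.your_table`")

try:
    rows = job.result(timeout=90)  # raises TimeoutError if the job is still running after 90 s
    for row in rows:
        print("Finished within the limit, count:", row.n)
except concurrent.futures.TimeoutError:
    print("Query took longer than 90 seconds; consider optimizing it before using it in an alert.")
```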

Happy Alerting 🙌🏻

