Set Up a BigQuery Backend

The easiest ETL pipeline ever.

Dead Simple Setup

Setup is super easy. Simply add our service account to your project's IAM policy with the BigQuery Data Owner and BigQuery Job User roles.

Make sure your Google Cloud Platform project is linked to a billing account with valid payment information, or we won't be able to insert into your BigQuery table!
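Not sure whether billing is enabled? You can check with the gcloud CLI; `your_project_id` is a placeholder, and on older gcloud versions this command lives under `gcloud beta billing`:

    # Check whether billing is enabled for your project (placeholder project ID)
    gcloud billing projects describe your_project_id
    # Look for "billingEnabled: true" in the output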

Add a Member

Add Our Service Account with Permissions

Our service account is [email protected].

Assign the BigQuery Data Owner and BigQuery Job User roles to the Logflare service account.
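If you'd rather use the gcloud CLI than the console, bindings like the following should do it; `your_project_id` is a placeholder and `LOGFLARE_SERVICE_ACCOUNT_EMAIL` stands for the service account address listed above:

    # Grant the Logflare service account the BigQuery Data Owner role
    gcloud projects add-iam-policy-binding your_project_id \
      --member="serviceAccount:LOGFLARE_SERVICE_ACCOUNT_EMAIL" \
      --role="roles/bigquery.dataOwner"

    # Grant the BigQuery Job User role as well
    gcloud projects add-iam-policy-binding your_project_id \
      --member="serviceAccount:LOGFLARE_SERVICE_ACCOUNT_EMAIL" \
      --role="roles/bigquery.jobUser"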

Set the Project ID in Logflare

Find your Google Cloud Platform project ID on your GCP dashboard.

Then navigate to your Logflare account preferences and add your GCP project ID.
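If you'd rather grab the project ID from the command line, either of these will print it:

    # List all projects you have access to, with their project IDs
    gcloud projects list
    # Or print the currently configured default project
    gcloud config get-value project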

Update the Source TTL

Edit a source and set the TTL, which controls how long events are retained in the underlying BigQuery table.
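If you want to double-check what ends up configured on the underlying table, you can inspect it with the bq CLI; the dataset and table names below are placeholders for your own:

    # Inspect the Logflare-managed table, including any partition expiration
    bq show --format=prettyjson your_dataset_name.your_table_name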

Query BigQuery

You can query any Logflare-managed BigQuery table directly if you need to.
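For example, a quick sanity check with the bq CLI might look like this; the project, dataset, and table names are placeholders, and you can select whatever columns your source actually has:

    # Run a Standard SQL query against your Logflare-managed table
    bq query --use_legacy_sql=false \
      'SELECT timestamp
       FROM `your_project_id.your_dataset_name.your_table_name`
       ORDER BY timestamp DESC
       LIMIT 10'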

BigQuery has great support for nested records and Standard SQL, both of which we take advantage of. To query inside a nested record you must UNNEST it, like so:

    -- Cross join each level of the nested metadata record so its fields can be selected and filtered
    SELECT timestamp, req.url, h.cf_cache_status
    FROM `your_project_id.your_dataset_name.your_table_name`,
    UNNEST(metadata) m,
    UNNEST(m.request) req,
    UNNEST(m.response) resp,
    UNNEST(resp.headers) h
    WHERE DATE(timestamp) = "2019-05-09"
    ORDER BY timestamp DESC
    LIMIT 10