NOTE:

Before setting up your first Recipe, you'll need 3 things:

  1. A Google BigQuery project

  2. A service account key for that BigQuery project

  3. An account on Query.Recipes, with a plan selected and billing enabled.

If you're setting up a specific Recipe, you may want to head to that Recipe's 'Start Here' page:

  1. Website Quality Audit

  2. Organic Pulse


QR allows you to set up a fully-baked data pipeline in Google BigQuery by filling out a short form.

The Recipe setup wizard configures 3 things for you:

  1. Creates the relevant BigQuery tables (for Google Analytics, Search Console, and other data sources) in your specified BQ project

  2. Configures raw data feeds for each of those data sources, to write data each day into those tables

  3. Schedules data models to run each time your raw data refreshes, to perform the actual analysis contained within the Recipe

After your last 2 years of data is backfilled in BigQuery (which takes 24-48 hours initially), you'll be able to set up data visualizations (in Sheets, Data Studio, or elsewhere) that make use of the Recipe's output.

Let's walk through the process, end-to-end.

The Recipe Setup Wizard

To get started, navigate to the Recipes page and select the recipe you'd like to set up.

For reference, the setup here will follow the Website Quality Audit, but the process follows the same general steps for any Recipe:

1. Authorize Accounts

Enter the credentials from the required accounts

In this step, you'll need to authorize the connections between QR and the various APIs needed for the Recipe.

For the WQA example above, a single Google account authentication will connect Google Analytics and Google Search Console.

A Deepcrawl API key is provided by us, so there is no authentication required - but if you'd prefer to use your own account, you can connect via API key.

Other Recipes may require additional accounts, such as Facebook Ads or Shopify.

For any given service (Google, Facebook Ads, Shopify, etc), you can connect multiple accounts as needed.

2. Name your Site

Add the Name and Domain for the Recipe you are setting up

Next, you'll define the name and the domain of the site you're analyzing. Select these values carefully, as they'll be used downstream in reporting.

  • Name: What you'd like to call this site. It will be displayed in your reporting, so enter it exactly (including capitalization) as you'd like it to appear.

  • Homepage URL: Add the URL as it appears on the homepage, minus the 'http://' or 'https://' prefix at the beginning.

3. Select Accounts

Select the proper site + view in each of the required accounts. Also add the sitemaps to crawl.

In this step, you'll select the actual properties that belong to this site.

Dropdown menus appear with the available properties from each of the data sources (Google Analytics, Search Console, etc.) that you connected in the first step.

Select the property that corresponds to your site - "Coding is for Losers: All Website Data", in our case for Google Analytics.

For any 3rd-party APIs where we provide access (like Deepcrawl), please always select the account named 'CIFL', which is our house account.

If you don't see the property you're looking for in the dropdown menus, you may need to authenticate an additional account for that service. You can do so via the "+" button to the right of the drop-down selector.

Authenticate an additional account through the '+' button

4. Select a BigQuery Project

If you haven't created a Google Cloud Account + your first Project, please follow this tutorial to get started.

Once your Project is created, select it from the dropdown menu of available Google BigQuery projects.

If you don't see the project you've created, you may need to authorize the Google account that's connected to that project (via the '+' button), or double-check that you have 'BigQuery Admin' permissions on the project.

Select a BigQuery Project

5. Upload BigQuery Service Account Key

Add the service account JSON key you generated earlier

If you haven't yet created a Service Account for your BigQuery project, please follow this tutorial on how to do so.

Once it's created, upload the JSON keyfile that you downloaded into the drag-and-drop window.

After adding the key, select 'Next' to move on.
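Before (or after) uploading, you can optionally sanity-check that the keyfile works against your project. Here's a minimal sketch using the google-cloud-bigquery Python client; the keyfile path is just a placeholder, and this check happens entirely outside of QR:

```python
# Optional sanity check (outside of QR): confirm the downloaded service
# account key can authenticate against your BigQuery project.
# Requires: pip install google-cloud-bigquery
# The keyfile path below is a placeholder - point it at the JSON key you downloaded.
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "path/to/your-service-account-key.json"
)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# If this prints your project ID (and any existing datasets) without raising
# an error, the key is valid and tied to the right project.
print("Connected to project:", client.project)
for dataset in client.list_datasets():
    print(" -", dataset.dataset_id)
```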

6. Confirm Details

The last step is to check and confirm the specifications of your site.

Please verify that all the accounts selected are the right ones and that the name and domain are correct.

If it's all good, select "Finish Setup" at the bottom and you are done!

Once your site is created, it'll appear on the Sites page, as well as on the Recipe Detail page for that Recipe.

After it's initially created, your site will appear with a status of Scheduled, which means all of the data backfills are being queued up.

Your site will appear with a status of Scheduled

After your last 2 years of data has finished backfilling (within 24-48 hours), this status will change from Scheduled to Completed.

After that initial backfill, data will populate in your BigQuery warehouse each night going forward.

Viewing your Data in BigQuery

Once your site setup is done, you can see it in action within your BigQuery project.

The Recipe setup within BigQuery

Within your project, QR will set up a new dataset for each recipe that you configure.

Datasets are named with the structure: recipe_org.

Our org is named cifl, so if we set up the WQA recipe, our dataset is named wqa_cifl.

Any new site that you add for a given Recipe will be written to that same dataset, so that data for all of your sites lives in the same place.
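If you'd rather check from a script than the BigQuery console, here's a small sketch (again using the google-cloud-bigquery Python client) that lists the tables QR has written to a Recipe's dataset. The project ID is a placeholder, and the dataset name follows the wqa_cifl example above:

```python
# List the tables QR has created in the Recipe's dataset.
# The project ID is a placeholder; the dataset name follows the wqa_cifl example.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project-id")
for table in client.list_tables("your-gcp-project-id.wqa_cifl"):
    print(table.table_id)
```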

After your first site setup, the first set of tables you'll see corresponds to the raw data sources required by the Recipe.

Going back to our example with the WQA, you'll see 4 different tables:

  • deepcrawl_all_pages_report_data_source

  • google_analytics_data_source

  • google_webmasters_data_source

  • qc

The first 3 tables correspond to each of the 3 core sources of the WQA: Deepcrawl, Google Analytics, and Google Search Console.

All of these tables are partitioned by month, which allows us to save on query time + cost when querying the tables.

The qc table, which aggregates the total for each metric for a specific date and source, is there for internal verification that the values being pulled are accurate.
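If you want to peek at the raw data, filtering on the date column keeps the query cheap by pruning partitions. Below is a hedged sketch that previews the Google Analytics source table from the WQA example; the `date` column name is an assumption, so check the table's schema in the BigQuery console for the actual partition/date field before running it:

```python
# Preview recent rows from a raw source table, filtering by date so BigQuery
# only scans the relevant partitions.
# Assumptions: the project ID is a placeholder, and the partition/date column
# is named `date` - verify both against your own dataset before running.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project-id")
query = """
    SELECT *
    FROM `your-gcp-project-id.wqa_cifl.google_analytics_data_source`
    WHERE date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    LIMIT 100
"""
for row in client.query(query).result():
    print(dict(row))
```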

Once your data finishes backfilling, you'll see a number of additional tables written to your dataset.

These are the data models that execute the Recipe itself.

These run in the background on QR, and populate a set of tables that you can use out of the box in your analysis.

A view of all the different tables created by the SQL models

Although this looks like a lot of tables, for each Recipe there are 1-4 key tables that you'll actually use day-to-day.

For example, for the WQA, we generally use the actions_data_studio table more than any other.

These key tables are documented on each Recipe's detail page, and will generally be the tables that are used in Data Studio report templates.
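Before wiring one of these key tables into a report, it can help to look over its columns. Here's a minimal sketch that prints the schema of the WQA's actions_data_studio table, assuming the wqa_cifl dataset from the example above (the project ID is a placeholder):

```python
# Inspect the schema of a Recipe's key output table before building reports.
# The project ID is a placeholder; the table name follows the WQA example above.
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project-id")
table = client.get_table("your-gcp-project-id.wqa_cifl.actions_data_studio")
print(f"{table.num_rows} rows")
for field in table.schema:
    print(field.name, field.field_type)
```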

Setting up Report Templates

The last step is to connect BigQuery to a reporting platform.

Regardless of what tool you end up using for reporting, QR allows you to save your visualizations from the 'Sites' table for easy access in the future.

Google Data Studio

With each Recipe, we provide a reporting template that you can copy and use directly in Google Data Studio.

If you haven't yet, check our article on Copying Data Studio Reports.

Google Sheets

If Google Sheets is your tool of choice, you can export entire tables or query results to Sheets for analysis directly from the BigQuery console.

You can find more detailed info here on connecting BigQuery and Sheets.

Other Reporting Tools

You're also welcome to create your own reports in Data Studio, or your favorite reporting tool - we're happy to help you navigate your options there, just drop us a line.

In Raw SQL

And, if you're feeling creative, and want to write your own SQL queries to use the output of your Recipe, check out this tutorial on learning BigQuery SQL.

Happy data chef'ing!
