NOTE: Before setting up your first Recipe, you'll need 3 things:

  1. A Google BigQuery project

  2. A service account key for that BigQuery project

  3. An account on Query.Recipes, with a plan selected and billing enabled

If this is your first-ever site setup, you'll also want to crack open this walkthrough of the site setup flow.

How we use the WQA

We do not recommend treating the Website Quality Audit's output as a punch list, to be worked through in one pass.

This is because new issues will *always* pop up on your sites - there is, of course, no such thing as done.

We instead recommend using the Recipe's action recommendations as a menu: a list of options to choose from when your team has time to embark on a technical, content, or site restructuring sprint.

What to expect in terms of timing

After going through the setup wizard, a few processes will kick off for you in the background:

Immediately

A site crawl is kicked off in Deepcrawl.

One year of Google Analytics + Search Console data begins to backfill, written to BigQuery tables ending in _draft.

dbt models run to generate your table structure in BigQuery. These models are written in SQL, and define the logic of the Recipe.
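For a sense of what those models look like under the hood, here's a minimal sketch of a dbt model (the file, model, and column names below are hypothetical, for illustration only):

    -- models/wqa_pages.sql (hypothetical names)
    -- A dbt model is just a SELECT statement; dbt materializes it
    -- as a table or view in BigQuery.
    {{ config(materialized='table') }}

    select
        page_url,
        page_type,
        sessions
    from {{ ref('stg_ga_sessions') }}  -- assumed upstream staging model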

At this point, you're free to set up your Data Studio report + Sheets workbook while data is backfilling.

When data backfills complete

Quality control (QC) runs on your raw data, to check for completeness.

For Google Analytics, for example, this means pinging the API a second time to confirm that session totals match for a given date range.

Once data completeness is confirmed, data is copied from _draft to production tables and is ready to use.
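Conceptually, the completeness check and the promotion to production boil down to something like this (a sketch in BigQuery SQL - the table and column names are assumptions, not the Recipe's actual schema):

    -- Compare session totals between the draft backfill and a second
    -- pull from the API, for the same date range (hypothetical names).
    select
      draft.date,
      draft.sessions as draft_sessions,
      recheck.sessions as recheck_sessions
    from `project.dataset.ga_sessions_draft` as draft
    join `project.dataset.ga_sessions_recheck` as recheck
      using (date)
    where draft.sessions != recheck.sessions;  -- any rows returned = incomplete data

    -- Once the check passes, draft data is promoted to production:
    create or replace table `project.dataset.ga_sessions` as
    select * from `project.dataset.ga_sessions_draft`;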

dbt models run once again, to refresh data for your Data Studio report + Sheets workbook.

Performing an initial review

After your data is live for the first time, you'll want to perform an initial review of page type + action recommendations - this is where the Sheets workbook comes in.

First, we review page type tagging, to confirm that pages are properly bucketed into the correct site section.

Our favorite page type is "exclude", because it marks a page as not relevant for your SEO efforts, which allows you to narrow your focus when reviewing data.

Then, we review action recommendations, and tag relevant actions with a status:

  • Unreviewed

  • Updated: action recommendation has changed since first review

  • To schedule

  • Working

  • Final review: action recommendation has been recognized as fixed

  • Complete

  • Not relevant: action that won't be worked on - our personal favorite

You can also add notes + a priority to each action, to make your tasks easier to sift through down the line.
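If you'd rather slice those same fields in BigQuery than in Sheets, a filter along these lines (hypothetical table and column names) narrows things down to work that's ready to pick up:

    -- Hypothetical names, for illustration only.
    select
      page_url,
      action_recommendation,
      status,
      priority
    from `project.dataset.wqa_actions`
    where page_type != 'exclude'  -- skip pages marked not relevant for SEO
      and status in ('Unreviewed', 'To schedule')
    order by priority;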

Ongoing daily updates

Each morning, raw data is pulled into BigQuery from your Google Analytics and Search Console accounts via API.

For Google Analytics, data is pulled t-1 (so Monday's data writes to BigQuery on Tuesday morning). For Google Search Console, data is pulled t-2 (Monday's data lands on Wednesday morning).
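In BigQuery date arithmetic, those offsets work out to something like the following (a sketch, not the Recipe's actual query):

    -- Google Analytics: yesterday's data (t-1)
    select date_sub(current_date(), interval 1 day) as ga_pull_date;

    -- Search Console: two days back (t-2)
    select date_sub(current_date(), interval 2 day) as gsc_pull_date;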

That data is again QC'd, and dbt models run to refresh data in your Data Studio report + Sheets workbook.

Ongoing monthly updates

At each month-end, your site will be crawled with Deepcrawl, and backlink counts will be refreshed via the Majestic API.

When that crawl completes, new action recommendations will be generated for the month, and a new 'report month' will be snapshotted.

So in February, you'll be analyzing your January 'report month' (since the crawl was run as of January month-end).

Your Data Studio report template + Sheets workbook automatically roll forward to the next 'report month' when we flip to the new month.
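If you ever need to reason about which 'report month' is current, the logic amounts to "the first day of last month" (a sketch in BigQuery SQL):

    -- Run in February, this returns January 1 - i.e. the January 'report month'.
    select date_trunc(date_sub(current_date(), interval 1 month), month) as report_month;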

Questions?

Feel free to drop us a note in live chat. Happy chef'ing!
