Looking for ways to load, transfer, and automate your Google Analytics data into Microsoft SQL Server, PostgreSQL, BigQuery, Redshift, Snowflake, or another data warehouse?
Last year I wrote an article on how to automate data transfer from Facebook Ads into multiple destinations. It made it to Google's first page and received a few positive responses from readers, so I thought of replicating the same thing for other tools, hoping it saves people time testing out different approaches. The approaches are the same as in the first article, but the ingredients and recipe are different. Let's go through them.
Why do we need Google Analytics data automation?
The most important benefit is saving time.
The same approach scales easily whether you are a boutique agency or a multi-brand owner with dozens of accounts.
For advanced analysis, especially machine learning and joining data across different marketing/CRM tools, we need a proper marketing data warehouse.
Other use cases are:
- Sending data into Tableau for [view only] visualization
- Sending data into a data warehouse for data scientists to do their analysis (some indeed pull it directly into an R Studio or Python environment)
Also, each of us works with a different technology stack, so I am going to share short snippets of every possible way so you can explore further. The information provided here is only an awareness of the different approaches, not an in-depth explanation, as each company mentioned below has more expert posts than I can provide.
Transferring to Google Sheets
You may just want to transfer the data directly into Google Sheets on an automated (hourly/daily) fetch for simple reporting like budget monitoring, a creatives report, or a simple analysis across multiple Google Analytics accounts. For this case, the best and most affordable ways are
Transferring to BigQuery
For usages like advanced analysis across 100+ accounts with terabytes of data, building a marketing data warehouse for business intelligence (BI), or any machine learning analysis where analysts want to pull numbers into their R Studio or Python environment, it is efficient to fetch data from BigQuery or a similar storage solution. There are multiple ways to do this, from manually writing against the API to fully automated pipelines, and I would recommend considering the ready-made solutions, as they make it easier to start fast.
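If you do want to go the manual-API route before buying a tool, the core of it is just a report request against the Google Analytics Reporting API v4. Below is a minimal sketch of building that request body; the view ID, dates, and metric/dimension choices are placeholder assumptions you would swap for your own.

```python
def build_report_request(view_id, start_date, end_date):
    """Build a batchGet request body for the GA Reporting API v4.

    The metrics and dimensions below (sessions, users, date, source)
    are just example choices -- pick whatever your report needs.
    """
    return {
        "reportRequests": [
            {
                "viewId": view_id,
                "dateRanges": [{"startDate": start_date, "endDate": end_date}],
                "metrics": [
                    {"expression": "ga:sessions"},
                    {"expression": "ga:users"},
                ],
                "dimensions": [{"name": "ga:date"}, {"name": "ga:source"}],
            }
        ]
    }


# With google-api-python-client installed and service-account
# credentials in place, the actual call would look roughly like:
#   from googleapiclient.discovery import build
#   analytics = build("analyticsreporting", "v4", credentials=creds)
#   response = analytics.reports().batchGet(
#       body=build_report_request("123456789", "7daysAgo", "today")
#   ).execute()
```

From there the rows in the response can be written to BigQuery with the google-cloud-bigquery client, which is exactly the plumbing the ready-made tools handle for you.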
Transferring to Google Sheets and then to BigQuery
For cases where you want to clean the data before it gets passed to BigQuery, you need some environment for that. Google has its beta solution, Dataprep, but not everyone can afford the expertise to use it. You can also clean and merge data with SQL queries directly in the BigQuery interface, but you may want the simplest solution, where everyone can play a part in the cleaning. Anyway, it's up to your requirements. So you can:
- Use Supermetrics/Funnel.io to import data into Google Sheets (clean, merge, or do custom calculations)
- Then automate loading your Google Sheet into BigQuery. A simple explanation of how to do that from Google
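If you script the second step yourself, the main chore is reshaping what the Sheets API hands back (a list of rows with a header row on top) into the list-of-dicts shape BigQuery's streaming insert expects. A minimal sketch, assuming the table name is yours to fill in:

```python
def sheet_values_to_rows(values):
    """Convert a Google Sheets values range (header row followed by
    data rows) into the list-of-dicts shape that BigQuery's
    insert_rows_json accepts."""
    header, *data = values
    return [dict(zip(header, row)) for row in data]


# With google-cloud-bigquery installed, loading would look roughly like:
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   rows = sheet_values_to_rows(sheet_response["values"])
#   client.insert_rows_json("my_dataset.ga_daily", rows)  # table name assumed
```

Note that values come out of Sheets as strings, so you would cast numeric columns (or let a BigQuery load job with schema autodetect do it) before analysis.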
Transferring into Tableau directly
What if you are already using a business intelligence tool like Tableau and your company is not in the Google or Microsoft environment? Simple:
- Native solutions available in this link
- or the video below
Transferring to Google Sheets and then to Tableau
A case where you are not a SQL expert but you are an expert at Excel formulas, or a big believer that coding is for losers, or your requirement itself has something to do with Google Sheets. Similar to BigQuery, Tableau has a free inbuilt Google Sheets connector that makes the job of connecting to Google Sheets easier, which means:
- Use Supermetrics/Funnel.io to import data into Google Sheets
- Clean, merge, and do custom calculations
- Use the Tableau Google Sheets connector and visualise in Tableau
Transferring into BigQuery > R Studio/Python environment > then to Tableau
What if you are a data scientist or expert analyst and want to get terabytes of data into BigQuery, because Google Sheets cannot accommodate your requirement of more than 2 million cells? Or you want to pull data into BigQuery, push it to an R Studio/Python environment for further data modeling or prediction algorithms, and then push the results to Tableau.
- Use Blendo, Funnel.io, or Xplenty to push data into BigQuery
- Use R Studio/Python to connect to BigQuery
- Do your stuff, then push the results back into BigQuery
- Use Tableau's inbuilt connector to connect to BigQuery
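The "do your stuff" step in Python usually means pulling rows down (for example with pandas-gbq's `read_gbq`), deriving new columns, and writing them back. As a minimal sketch of that middle step, here is a pure-Python version of adding a derived metric; the column names (`sessions`, `transactions`) are assumptions to be matched to your own schema:

```python
def add_conversion_rate(rows):
    """Add a derived conversion-rate column to rows pulled from
    BigQuery, guarding against division by zero.

    `sessions` and `transactions` are assumed column names --
    rename them to match your table's schema.
    """
    for row in rows:
        sessions = row.get("sessions", 0)
        row["conversion_rate"] = (
            row.get("transactions", 0) / sessions if sessions else 0.0
        )
    return rows


# In a real pipeline the surrounding calls would look roughly like:
#   import pandas_gbq
#   df = pandas_gbq.read_gbq("SELECT * FROM my_dataset.ga_daily")  # table assumed
#   ...derive columns...
#   pandas_gbq.to_gbq(df, "my_dataset.ga_enriched", if_exists="replace")
```

Once the enriched table is back in BigQuery, Tableau's native BigQuery connector picks it up like any other table.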
Transferring to Microsoft Excel
You may say "My company is not in the Google apps environment", or "Google Sheets is for kids :D", or "Google Sheets has a lot of limitations and can be very slow sometimes". No worries, free and paid tools are available:
- Supermetrics for Excel
Transferring to Google Data Studio directly
Ooh! Finally, the free visualization tool. For budget users, the perfect love comes from Google.
For direct transfer: