Looking for ways to automate your Twitter Ads data into Google Sheets, BigQuery, or another data warehouse?
Last year I wrote an article on how to automate data transfer from Facebook Ads into multiple destinations. It made it to Google's first page and received a few positive responses from readers, so I thought I would replicate the same thing for other tools as well, hoping it saves people time testing out different approaches. The approaches from the first article are the same, but the ingredients and recipe are different. Let's go through them.
Like many of us, if you don't have a penny to spend on tools, or aren't even aware that automation is possible, you end up manually exporting data from Twitter Analytics and pasting it into Excel or Google Sheets for reporting and analysis. Now there are multiple tools and approaches to automate Twitter Ads and Analytics data, especially into BigQuery, Tableau, Google Sheets, etc.
Why do we need Twitter data automation?
- The biggest benefit is saving time. Time is especially limited for a startup with few resources.
- The same approach scales easily, whether you are a boutique agency, a multi-brand owner with dozens of accounts, or someone managing a single account.
- It is simply more efficient. Why have a human sit and prepare reports every day when you can automate? We can spend that time on more productive work instead.
- Advanced analysis, especially machine learning and joining data across different marketing/CRM tools, needs a proper marketing data warehouse.
- Everything from simple use cases like budget monitoring or creative reports up to predictive data modelling and natural language processing can only be achieved with proper data infrastructure. This is critical in marketing especially, as we are the ones who strategise the company's growth, from collecting audience data to analysing it.
The reason I chose Twitter as the third option in this series of data automation posts is that Google products are already easy to transfer (there are ready-made connectors, since Google has its own data warehouse in BigQuery), and Facebook, the other major marketing channel where our money is spent, was covered in the first article.
Some of the use cases I usually build for different clients are:
- Budget pacing monitoring
- Creative performance
- Audience performance
- And other regular KPI sheet automation
Some more advanced use cases are:
- Sending data into Tableau for view-only visualisation
- Sending data into a data warehouse so data scientists can run their own analysis (some people do pull directly into an R Studio or Python environment, but it depends)
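To make the first use case concrete: budget pacing boils down to comparing actual spend against an even daily pace for the month. A minimal sketch in plain Python (the function name and numbers are illustrative, not taken from any of the tools below):

```python
from datetime import date

def budget_pacing(monthly_budget, spend_to_date, today):
    """Compare actual spend against an even daily pace for the month."""
    # Number of days in the current month, handling the December rollover.
    days_in_month = (date(today.year + today.month // 12,
                          today.month % 12 + 1, 1)
                     - date(today.year, today.month, 1)).days
    # Spend we would expect if the budget were spread evenly over the month.
    expected = monthly_budget * today.day / days_in_month
    return {
        "expected_spend": round(expected, 2),
        "actual_spend": spend_to_date,
        # pace > 1 means overspending, < 1 means underspending
        "pace": round(spend_to_date / expected, 2) if expected else None,
    }
```

For example, halfway through a 30-day month with a $3,000 budget, expected spend is $1,500; if $1,600 has actually been spent, the pace is about 1.07 (slightly over budget).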
Also, each of us works with a different technology stack, so I am going to share short snippets of every possible approach so you can explore further. The information here is only an overview of the different approaches, not an in-depth explanation, as each company mentioned below has more expert posts than I can provide.
Transferring to a Google Sheets-only stack:
You may just want to transfer the data directly into a Google Sheet on an automated (hourly/daily) fetch, for simple reporting like budget monitoring, creative reports, or simple analysis across multiple Twitter ad accounts. For this case, the best and most affordable options are:
- Supermetrics (Paid).
- Blockspring (Paid)
- Singer.io (open source; coding knowledge required, but worth trying for a day)
Transferring directly to BigQuery
For usages like advanced analysis across 100+ accounts with terabytes of data, building a marketing data warehouse for business intelligence (BI), or any machine learning analysis where analysts want to pull numbers into their R Studio or Python environment, it is efficient to fetch data from BigQuery or a similar storage solution. There are multiple ways to do this, from manually writing against the API to fully automated pipelines, so I would recommend considering ready-made solutions, as they are the fastest way to start:
- Stitchdata (Bulk transfer for all data)
- Alooma (Write a schema and then transfer the required data)
- Singer.io (open source, coding knowledge required)
Transferring to Google Sheets and then to BigQuery
If you want to clean the data before it gets passed to BigQuery, you need some environment for that. Google has a beta solution, Dataprep, but not everyone can afford the expertise to use it. You could also clean and merge data with SQL queries directly in the BigQuery interface, but you may want the simplest option, where everyone can play a part in the cleaning. It's up to your requirements. So you can:
- Use Supermetrics/Blockspring/Singer to import data into a Google Sheet (clean, merge, or do custom calculations there)
- Then automate loading your Google Sheet into BigQuery. Google has a simple explanation of how to do that: https://cloud.google.com/blog/big-data/2017/09/how-to-crunch-your-business-data-from-sheets-in-bigquery
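If you prefer to script that last hop yourself, the step splits into cleaning the sheet rows and loading them into BigQuery. A minimal sketch, assuming the google-cloud-bigquery client library, configured credentials, and placeholder project/dataset/table names:

```python
def clean_rows(rows):
    """Turn raw sheet rows (all strings) into typed records, dropping blanks."""
    cleaned = []
    for row in rows:
        if not row.get("campaign"):   # skip empty rows exported from the sheet
            continue
        cleaned.append({
            "campaign": row["campaign"].strip(),
            # strip currency formatting like "$1,200.50" before casting
            "spend": float(row["spend"].replace("$", "").replace(",", "") or 0),
            "clicks": int(row["clicks"] or 0),
        })
    return cleaned

def load_to_bigquery(records):
    # Illustrative only -- "my_project.marketing.twitter_ads" is a placeholder,
    # and the table is assumed to already exist with a matching schema.
    from google.cloud import bigquery
    client = bigquery.Client()
    errors = client.insert_rows_json("my_project.marketing.twitter_ads", records)
    if errors:
        raise RuntimeError(errors)
```

The cleaning function runs anywhere; only `load_to_bigquery` needs the client library and credentials.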
Transferring into Tableau directly
What if you are already using a business intelligence tool like Tableau, and your company is not in the Google or Microsoft environment? Simple:
- Stitchdata https://www.stitchdata.com/integrations/facebook-ads/tableau/
- Cognetik. I have never tried this one, but it looks very straightforward: http://cognetik.com/cloud-tableau-connector
Transferring to Google Sheets and then to Tableau
Say you are not a SQL expert 🙂 but you are an expert at Excel formulas and a big believer that coding is for losers 😀, or your requirement itself involves Google Sheets. Similar to BigQuery, Tableau has a free inbuilt Google Sheets connector that makes the job easier, which means:
- Use Supermetrics/Blockspring/Singer to push the data into a Google Sheet
- Clean, merge, and do custom calculations
- Use the Tableau Google Sheets connector and visualise in Tableau
Transferring into BigQuery > R Studio/Python environment > then to Tableau
What if you are a data scientist or expert analyst who wants terabytes of data in BigQuery, since a Google Sheet cannot accommodate a requirement of more than 2 million cells? Or you want to pull data into BigQuery, push it into an R Studio/Python environment for data modelling or prediction algorithms, and then push the results to Tableau:
- Use Stitchdata/Alooma/Fivetran/Blendo/Singer to push data into BigQuery
- Use R Studio/Python to connect to BigQuery
- Do your analysis, then push the results back into BigQuery
- Use Tableau's inbuilt connector to connect to BigQuery
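The Python leg of that loop can be sketched as below. The aggregation function is runnable as-is; `run_pipeline` only illustrates the BigQuery round trip, assuming the google-cloud-bigquery library and placeholder table names:

```python
def ctr_by_creative(rows):
    """Aggregate raw ad rows into click-through rate per creative."""
    totals = {}
    for r in rows:
        imp, clk = totals.get(r["creative"], (0, 0))
        totals[r["creative"]] = (imp + r["impressions"], clk + r["clicks"])
    return {c: round(clk / imp, 4) if imp else 0.0
            for c, (imp, clk) in totals.items()}

def run_pipeline():
    # Illustrative only -- dataset/table names are placeholders.
    from google.cloud import bigquery
    client = bigquery.Client()
    rows = [dict(r) for r in client.query(
        "SELECT creative, impressions, clicks "
        "FROM `my_project.marketing.twitter_ads`"
    ).result()]
    summary = [{"creative": c, "ctr": v}
               for c, v in ctr_by_creative(rows).items()]
    # Write the modelled/aggregated result back for Tableau to pick up.
    client.insert_rows_json("my_project.marketing.twitter_ads_summary", summary)
```

Tableau then connects to the summary table with its inbuilt BigQuery connector, as in the last step above.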
Transferring to Microsoft Excel
You may say "My company is not in the Google apps environment", "Google Sheets is for kids :D", or "Google Sheets has a lot of limitations and can be very slow sometimes". No worries, free and paid tools are available.
Transferring to Google Data Studio directly
Ooh! Finally, a free visualisation tool 🙂 For budget users, the perfect match comes from Google.
For direct transfer:
- Ad, Analytics, Social Data by ReportDash
As a slightly more indirect way, TapClicks has a custom solution as well.
Transferring into Snowflake, Amazon Redshift, or Microsoft Power BI
In short, here is a list of tools available for transfer to BigQuery, Google Sheets, Snowflake, and Microsoft Excel:
- Owox BI
- Adverity Data Tap
If you ask me where to start, I would recommend Supermetrics, which I have been using for 3+ years and which is well worth the $39 basic package. If it is mostly for visualisation, just go ahead with Data Studio, as it's free and still works with all Google products; if you want more, you can add the Supermetrics connectors.