When working with large volumes of data, data manipulation is one of the biggest pain points for analysts. The lack of an integrated ecosystem across digital environments, business intelligence tools, and data warehouses makes it difficult for even the most experienced practitioners to quickly import and export data.

Analysts are stuck with the same painful process every time they need to update a database, with little chance to steer away from repetitive tasks.


Google Analytics and MySQL


Google Analytics is the most widespread reporting platform in the world, with millions of digital properties using it for real-time and historical data. However, as data volumes grow, analysts have to extend their tool arsenal to data warehouses.

Many analysts download daily data from GA reports and append it manually to SQL tables, a tedious and slow process.
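For reference, the scripted version of that manual routine often looks roughly like the sketch below. It is only an illustration: the CSV file name, the ga_daily_sessions table, and its columns are hypothetical, and it assumes the mysql-connector-python package.

```python
# Rough sketch of the manual routine: append a downloaded GA CSV export
# to a MySQL table. File, table, and column names are hypothetical.
import csv
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="db.example.com", user="analyst", password="secret", database="analytics"
)
cursor = conn.cursor()

with open("report.csv", newline="") as f:
    for row in csv.DictReader(f):
        cursor.execute(
            "INSERT INTO ga_daily_sessions (report_date, sessions, users) "
            "VALUES (%s, %s, %s)",
            (row["Date"], row["Sessions"], row["Users"]),
        )

conn.commit()
cursor.close()
conn.close()
```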

By default, Google Analytics does not offer an option to export data directly into your data warehouse. You can use its Reporting API to pull report data, but there is one big issue: in the basic, free GA version, reports are sampled, which means only a percentage of your data is actually available. Only Analytics 360 users have access to unsampled series.
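If you go the Reporting API route, a pull looks roughly like the following sketch. The service-account key path, view ID, and chosen metrics and dimensions are placeholders rather than anything prescribed by the tutorial; note that the API flags sampled reports explicitly in its response.

```python
# Hedged sketch: pull daily sessions through the GA Reporting API v4.
# Key file, view ID, metrics, and dimensions below are placeholders.
# pip install google-api-python-client google-auth
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(
    body={
        "reportRequests": [{
            "viewId": "123456789",  # placeholder GA view ID
            "dateRanges": [{"startDate": "7daysAgo", "endDate": "yesterday"}],
            "metrics": [{"expression": "ga:sessions"}],
            "dimensions": [{"name": "ga:date"}],
        }]
    }
).execute()

report = response["reports"][0]
# The API reports sampling explicitly; on the free tier this shows up often.
if "samplesReadCounts" in report["data"]:
    print("Warning: this report is based on sampled data.")

for row in report["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```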

The process of exporting data via third-party tools, or simply doing it by hand, is not only time-consuming for analysts, it also hurts productivity at the company level.

The time spent manually importing and exporting data could be spent on tasks worthy of an analyst’s time, such as generating actionable insights.

However tedious the process, there are undeniable advantages to leveraging MySQL, such as aggregating and comparing data from multiple websites. For example, if a publishing company has three different blogs, it could import metrics from all three and see how they compare to each other.
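To make the three-blog example concrete, here is an illustrative comparison, assuming each blog's GA metrics already live in their own MySQL tables (hypothetical names blog_a_daily, blog_b_daily, blog_c_daily, each with a sessions column):

```python
# Illustrative comparison across three blogs whose GA metrics already
# live in MySQL. Table and column names are hypothetical.
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="db.example.com", user="analyst", password="secret", database="analytics"
)
cursor = conn.cursor()
cursor.execute("""
    SELECT 'blog_a' AS property, SUM(sessions) AS total_sessions FROM blog_a_daily
    UNION ALL
    SELECT 'blog_b', SUM(sessions) FROM blog_b_daily
    UNION ALL
    SELECT 'blog_c', SUM(sessions) FROM blog_c_daily
""")
for property_name, total_sessions in cursor.fetchall():
    print(property_name, total_sessions)

cursor.close()
conn.close()
```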

By using MySQL, you can also build more complex reports and compare more relevant metrics. Its flexibility and high performance make it the data warehouse of choice for thousands of analysts.

So, what if you could import data from GA directly into MySQL, without any manual intervention? Lucky for you, we have the tool for that.


GA – Cognetik Data Streams – MySQL


Data Streams comes to the aid of analysts who struggle with this issue. The cloud-based tool makes it easy to extract and transfer data into your warehouse of choice (MySQL in this instance), eliminating time-consuming manual processes so you can get to insights faster.

On top of GA, it can also connect and pull data from the top marketing and clickstream sources, such as Adobe Analytics, DoubleClick, Facebook, and more.

Check our tutorial below to see how to use Data Streams. The first step is to connect your Google Analytics account to our platform.

We use the same security protocols used by Amazon, Microsoft, and Google. We don’t sell, store, or access your data in any way.


How to Link Your Google Analytics Credentials in Data Streams


After signing up for Data Streams, you need to link your Google Analytics credentials to use Google Analytics as a data source.


Here’s how you do that:

1. Click ‘SET DATA EXTRACT SOURCE.’

2. Under ‘Connect to a New Data Source’, choose ‘Google Analytics’ from the drop-down menu.

3. After choosing your platform, click ‘Link.’

4. Enter the Google credentials that have access to the Google Analytics account you want to set as a data source.

5. Grant the app permission to access your Google data by clicking ‘Allow.’

The second step would be to link MySQL to your Data Streams account.


How to Link Your MySQL Account in Data Streams


Linking your data warehouse credentials (Amazon Redshift, MySQL, MS SQL, Teradata, Snowflake, and PostgreSQL) in Data Streams follows the same process. After setting up your Data Source, you need to set up your Data Destination block.

1. Click the Data Destination block.

2. Choose the Data Warehouse you want to link from the drop-down menu.

3. Click ‘Link.’

4. Fill out the form and click ‘Submit.’

  • The alias can be any name that helps you identify this credential in the Data Streams app.
  • The server address is where your server is found; it can be a fully qualified name or an IP address.
  • Username and Password are your data warehouse credentials. (A quick way to sanity-check these connection details is sketched after the steps below.)


5. After linking your credentials, choose the database to move your data into from the drop-down menu in the ‘Database’ field.

6. Choose to either move your data into an existing table within the database you chose, or create a new one.


* If you choose an existing table, proceed by selecting it from the drop-down menu and hit ‘Submit’.
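As mentioned above, here is a minimal sketch for checking the connection details the form asks for before you submit them. The host, credentials, and database are placeholders, and it assumes the mysql-connector-python package; it simply confirms the server is reachable and lists the tables you could pick in step 6.

```python
# Minimal connection check for the details entered in the Data Destination form.
# Host, user, password, and database are placeholders for your own values.
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(
    host="mysql.example.com",   # server address: fully qualified name or IP
    user="datastreams_user",    # data warehouse username
    password="secret",          # data warehouse password
    database="analytics",       # the database you plan to move data into
)

cursor = conn.cursor()
cursor.execute("SHOW TABLES")   # lists candidate tables for step 6
for (table_name,) in cursor.fetchall():
    print(table_name)

cursor.close()
conn.close()
```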

About the author

Sebastian Stan

Sebastian is a journalist and digital strategist with years of experience in the news industry, social media, content creation & management, and digital analytics.