Three Invaluable Tools For Obtaining Data Insights
To improve the success of any digital marketing campaign, we need to obtain valuable insight from as much data as we can. Now that we are losing nearly all organic search keyword data from Google Analytics, we need to look at different sources of data to replace some of the insight that we have lost.
To help us sift through the enormous amount of data available to us, there are many tools that automate the process. Below I will discuss the three programs that I make use of on a daily basis, almost without fail. For each of these programs I will talk through how it can be used and the benefits it offers:
Excel
It may not be the obvious first choice, and you might be thinking “why is that in here?” There are many reasons to use Excel, but here is a short list of mine:
Link Analysis
We all use one of the following tools to obtain the link data of different domains:
- Open Site Explorer
- Majestic SEO
- Google Webmaster Tools
- Link Research Tools
These are the most popular tools to get an accurate picture of what your backlink profile looks like (please see my previous post on how to visualise your link profile).
These tools allow you to download the links as a CSV, which opens the door to further data manipulation. For example, it gives us the chance to group links by Domain Authority, Page Authority, and so on.
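The grouping step can also be scripted once the CSV is downloaded. Here is a minimal Python sketch, assuming the export contains “Domain Authority” and “URL” columns; column names vary between tools, so adjust the field names to match your own export:

```python
import csv
from collections import defaultdict

def da_band(da):
    """Map a Domain Authority score (0-100) to a ten-point band label."""
    low = (int(da) // 10) * 10
    return f"{low}-{low + 9}"

def bucket_links(rows, da_field="Domain Authority", url_field="URL"):
    """Group exported link rows into Domain Authority bands.

    `rows` is any iterable of dicts, e.g. csv.DictReader over the export:
        with open("links.csv", newline="", encoding="utf-8") as f:
            buckets = bucket_links(csv.DictReader(f))
    """
    buckets = defaultdict(list)
    for row in rows:
        buckets[da_band(float(row[da_field]))].append(row[url_field])
    return dict(buckets)
```

A pivot table in Excel achieves the same result; a script is simply easier to re-run when you refresh the export.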
Title Tag Creation
So you have been using the new Google Keyword Planner and you have seen a common trend in your target keywords, but you have tens, hundreds, or even thousands of title tags to create. Well, Excel’s CONCATENATE function allows you to merge the contents of cells together. This saves a lot of time and stops you having to type out each title tag by hand, which is not only tedious but also extremely time-consuming.
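The same concatenation logic can be sketched in a few lines of script too. In this hypothetical Python example, the “keyword in location | brand” template and the 60-character length check are assumptions; swap in whatever pattern your keyword research points to:

```python
def build_title(keyword, location, brand, max_length=60):
    """Merge cell values into a title tag, like =A2&" in "&B2&" | "&C2 in Excel."""
    title = f"{keyword} in {location} | {brand}"
    # Flag titles that risk being truncated in the search results.
    return title, len(title) <= max_length

# Hypothetical rows pulled from a keyword research spreadsheet.
rows = [("Plumber", "Leeds", "Acme Trades"),
        ("Emergency Boiler Repair Services", "Greater Manchester", "Acme Trades")]
for keyword, location, brand in rows:
    title, fits = build_title(keyword, location, brand)
    print(("OK" if fits else "LONG") + "  " + title)
```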
Obtaining Traffic Data
You’ll have a lot of keywords that you know you are going to target, but you don’t want to add each keyword to the Keyword Planner manually and then copy the data into the spreadsheet that you have created.
Fortunately, a lovely API integration has been created (you will need an AdWords account to set this up) that allows you to bring search volumes directly into your spreadsheet. More information can be found here – http://moz.com/blog/search-volume-data-excel
SeoTools for Excel
I can’t really talk about how amazing Excel is without talking about SeoTools for Excel. There are too many functions for me to list (luckily Niels has already listed them for me), but it basically covers:
- On Site
- Integration with Google Analytics
- Network Status
I like using it alongside the link analysis that I perform, as it allows me to check that all the links listed are still working correctly. Because this is an extension for Excel, its formulas appear in the formula bar, as they are “built” into Excel after the install.
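If you don’t have the add-in to hand, the same broken-link check can be scripted. A minimal Python sketch using only the standard library; the status buckets mirror how you might flag cells in a spreadsheet:

```python
import urllib.request
import urllib.error

def check_link(url, timeout=10):
    """Return the HTTP status code for a URL, or None if it is unreachable."""
    try:
        req = urllib.request.Request(url, method="HEAD",
                                     headers={"User-Agent": "link-checker"})
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # The server answered with an error status (404, 500, ...).
        return e.code
    except (urllib.error.URLError, OSError):
        return None

def classify(status):
    """Bucket a status code into ok / redirect / broken / unreachable."""
    if status is None:
        return "unreachable"
    if status < 300:
        return "ok"
    if status < 400:
        return "redirect"
    return "broken"
```

Run `classify(check_link(url))` over the URL column of your link export to get a working/broken verdict per link.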
Screaming Frog SEO Spider
This is a more obvious choice. I have mentioned it in my previous two posts, but I cannot stress enough how valuable this tool is. Its main function is to crawl a domain and highlight all the resources and URLs associated with it.
The information it returns is incredibly valuable, such as the types of redirect in use or the canonicals in place, amongst others. It also reports on the most important SEO elements, such as title tags and meta descriptions.
However, there are some features that many users may not be aware of. For example, the tabs at the bottom allow you to view the links into a page as well as out of it. This allows for targeted improvement of the internal linking structure, meaning link equity can be distributed throughout the site in a more efficient and effective way.
As the program actively scans and analyses the data that appears in the source code, you can use its custom filters to check for things such as:
- Google Remarketing
- Google Analytics
- Google Conversion Tracking
Each of these three tags has a piece of code that is unique to it, which means you can actively seek out pages that may be missing tracking codes. This makes diagnosing such issues easier and allows them to be solved more quickly.
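A scripted version of the same check is straightforward. In this Python sketch the signature fragments are illustrative assumptions; copy the exact strings from your own tracking snippets:

```python
# Each tag is identified by a fragment of its code. These fragments are
# examples only - substitute the exact strings your pages actually use.
TAG_SIGNATURES = {
    "Google Analytics": "google-analytics.com/ga.js",
    "Google Conversion Tracking": "googleadservices.com/pagead/conversion",
}

def missing_tags(page_source, signatures=TAG_SIGNATURES):
    """Return the names of tags whose code fragment is absent from the page."""
    return [name for name, fragment in signatures.items()
            if fragment not in page_source]
```

Feed each crawled page’s HTML into `missing_tags` to build a list of pages that need tracking code added.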
Apache Log Viewer
For people who don’t know what this is, Apache Log Viewer is a tool for analysing server logs. It is becoming a very popular tool in the office; I now use it most days to review which pages on a site are being crawled.
The only thing you need for this to work is access to the server on which your website is hosted. You can then download the server logs and load them into the software. As raw server logs are massive, unstructured text files, this makes reviewing the crawled pages much easier.
The main use of this program is to quickly identify which areas of the site are being crawled, giving you an understanding of where search engine crawlers are spending their “crawl budget”. It will also show any duplicate content that may appear via filters or URL variables enabled on the website. These can then be addressed via the site’s robots.txt file, and, combined with improved internal linking, this will result in more pages on the site being crawled.
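For very large logs, a short script can answer the same question. Here is a rough Python sketch that parses Apache’s “combined” log format and counts requests per URL for a given crawler; the regex assumes the default combined format, so adjust it if your server is configured differently:

```python
import re
from collections import Counter

# Apache "combined" log format: ip, identity, user, [time], "request",
# status, bytes, "referrer", "user agent".
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def crawled_paths(log_lines, bot="Googlebot"):
    """Count how often each URL path was requested by a given crawler."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and bot in m.group("agent"):
            counts[m.group("path")] += 1
    return counts
```

Sorting the resulting counter (`counts.most_common()`) shows at a glance where the crawl budget is going, including any parameterised duplicate URLs.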
Using all three of these programs together allows me to understand a campaign and establish a baseline for it. I can then work with the rest of the team to build an effective plan to increase the campaign’s performance.
Those are my three favourite programs for getting more insight into the campaigns I work on. What programs do you use to gain more insight? Let me know in the comments.