What tools am I using for SEO?

By Martijn Scheijbeler · Published September 3, 2017

A while back somebody posted the SEO platforms/vendors/tools that he was using at his agency job (as an SEO). Since I missed some great tools in there, I decided to respond, but it also got me thinking about my own toolset, so I'm dedicating a blog post to it: to get better recommendations and learn from others what they're using, but hopefully also to shine some light on what I'm looking for in tools. This isn't all of it, and I didn't really have time to explain in detail what I'm using specific tools for (I might dedicate some posts to that over time), but I at least wanted to give you a first look. So here we go…

In general, I have three requirements for tools:

  • It should be easy to use & user friendly: no weird interfaces, and only stuff that works (90% of my tools do).
  • It should have the most data/features available, or the opposite: a very specific focus on one element of what I'm looking for.
  • It must have an API so I can build things on top of it, preferably included in the pricing of the tool (normal for most tools these days).

Google Search Console, Bing Webmaster Tools, Yandex Webmaster Tools

Obviously Google Search Console is the tool that really matters out of the three, as most of my time is spent managing our visibility in Google. My favorite reports are Search Analytics, for getting a quick overview of our performance (we use most of that data outside the tool, via their API/R library), and Structured Data (don't forget about the Structured Data Testing Tool), to track what we're doing with Schema.org on our pages. From time to time I'll also look into the Index Status report when I'm dealing with multiple domains at the same time.
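To give you an idea of what pulling that Search Analytics data outside of the interface looks like, here's a minimal Python sketch against the Search Console API; the property URL and key file are placeholders, and the R route I mentioned ends up hitting the same endpoint.

```python
# pip install google-api-python-client oauth2client
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

# Assumptions: a service account key with access to the property,
# and a verified site URL; swap in your own values.
SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    "service-account.json", SCOPES
)
service = build("webmasters", "v3", credentials=credentials)

# Clicks & impressions per query for a date range.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2017-08-01",
        "endDate": "2017-08-31",
        "dimensions": ["query"],
        "rowLimit": 5000,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```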

One of the reasons I like Bing Webmaster Tools is that their Index Explorer enables you to find directories & subdomains that exist on a site, a great benefit if you're just getting started with a new site. Even after years at The Next Web, and these days at Postmates, I'm still finding out about folders or subdomains that you never hear about on a day-to-day basis but that might cause issues for SEO.

Google Analytics & Google Tag Manager

You get the point on this one, right? You're tracking your traffic, and the combination of the two can help you capture all the contextual data, through custom dimensions or other metrics/dimensions, that will help you understand your data better. I blogged about them many times on The Next Web while I was there and will continue to do so in the future.

Screaming Frog & Deepcrawl

Getting more insight into your technical structure is super valuable when you're working on a technical audit. Screaming Frog for day-to-day use on subsets of pages and Deepcrawl for weekly all-pages crawls are both very powerful and help me figure out what kinds of pages or segments are creating issues. I like to use them both, as they have different reports, and the differences between the tools help me better understand issues.

In my current toolset, Botify, which I'll mention later in this post, is a third option.
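As a quick example of what I mean by segments creating issues: the sketch below groups a Screaming Frog "Internal" CSV export by top-level directory and status code. The file name and column headers are assumptions that depend on your version and export settings.

```python
# pip install pandas
from urllib.parse import urlparse

import pandas as pd

# Assumed Screaming Frog "Internal" export; older versions prepend a
# title row, in which case you'd add skiprows=1.
crawl = pd.read_csv("internal_all.csv")

# Segment every URL by its first path directory.
crawl["segment"] = crawl["Address"].map(
    lambda url: "/" + urlparse(url).path.lstrip("/").split("/")[0]
)

# Status codes per segment make problem areas stand out quickly.
print(crawl.groupby(["segment", "Status Code"]).size().unstack(fill_value=0))
```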

SEMrush & Google AdWords Keyword Tool

You always want more insight into keywords, and you want to know more about them; that's what both tools are great at. They give you a great basis for keyword research, which you can use as the start of your site's architecture, keyword structures and internal link structure. In my previous blog post on Google Search Console I kicked off the basis for keyword research based on that data; if you want to take it easy, go with these tools (as a start).
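Once you've exported keyword ideas from both tools, turning them into one research base is mostly a data-cleaning job. A hypothetical sketch (the file names and column headers are made up; map them to your actual exports):

```python
# pip install pandas
import pandas as pd

# Hypothetical exports; rename the columns to match your real files.
semrush = pd.read_csv("semrush_related.csv")[["Keyword", "Search Volume"]]
adwords = pd.read_csv("adwords_ideas.csv").rename(
    columns={"Avg. Monthly Searches": "Search Volume"}
)[["Keyword", "Search Volume"]]

# Merge, normalize and dedupe, keeping the highest reported volume.
keywords = pd.concat([semrush, adwords], ignore_index=True)
keywords["Keyword"] = keywords["Keyword"].str.lower().str.strip()
keywords = keywords.sort_values("Search Volume", ascending=False)
keywords = keywords.drop_duplicates("Keyword")

# A crude head-term grouping as a first pass at site architecture.
keywords["head_term"] = keywords["Keyword"].str.split().str[0]
print(keywords.groupby("head_term")["Search Volume"].sum().nlargest(20))
```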

Majestic

Majestic might not be the most user friendly (hint & sorry!), but as they have one of the largest link indexes it's great for link research. In this case I definitely value data + quality over the friendliness of the tool.

AuthorityLabs / SERPmetrics

I still deeply believe in using ranking data. As I have the opportunity to do this at large scale, and to use the data at both the national & local level, it helps me get a better understanding of what's happening in the rankings and, mostly, what's moving. It's not necessarily that I'm interested in our own rankings or our competitors'; if certain features in the SERP suddenly move up, that helps me understand why certain metrics are moving (or not). It's a great source of intelligence data that you can leverage for prioritization and for measuring your impact.

AuthorityLabs used to be my favorite tool, but since they changed their pricing model I've switched over to SERPmetrics.
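To make the SERP-feature point concrete: with daily ranking exports you can track what share of your keyword set shows a given feature. The format below (one row per keyword per day, with a comma-separated features column) is purely an assumption, since every provider structures this differently.

```python
# pip install pandas
import pandas as pd

# Hypothetical rank-tracker export: one row per keyword per day,
# with a "features" column like "local_pack,featured_snippet".
ranks = pd.read_csv("rankings.csv", parse_dates=["date"])

# Share of tracked keywords showing a local pack, per day.
has_local = ranks["features"].fillna("").str.contains("local_pack")
local_share = has_local.groupby(ranks["date"]).mean()

# A sudden jump here can explain CTR/traffic shifts that average
# rank alone won't show.
print(local_share.tail(14))
```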

Botify / Servers

I'll try to write a follow-up blog post explaining how this data can help you get more insight into the performance of the features/products that you build. But getting insights out of the log file data that you have on your servers can be extremely useful (I must add that this mostly applies to big-site SEO). Right now I'm using Botify for this.
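If you want a taste of this without a platform, counting Googlebot hits per directory from a raw access log already gets you somewhere. A minimal sketch, assuming a combined-format Apache/NGINX log (and skipping the reverse-DNS check you'd want in order to verify real Googlebot traffic):

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Matches the request and user-agent fields of a combined-format log line.
LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*".*"(?P<agent>[^"]*)"$')

hits = Counter()
with open("access.log") as f:
    for line in f:
        m = LINE.search(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue
        # Bucket crawl activity by top-level directory.
        path = urlparse(m.group("path")).path
        hits["/" + path.lstrip("/").split("/")[0]] += 1

for segment, count in hits.most_common(20):
    print(f"{count:8d}  {segment}")
```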

Monitoring

For bigger sites it's really hard to stay up to date on the latest changes, as so many people are working on them. That's why you want to make sure you get alerted when important SEO features change. We're using some custom scripts in Google Drive, but I also like to use SEORadar.
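Those custom scripts don't have to be fancy. The idea, sketched below with placeholder URLs: fetch your key templates, snapshot the tags you care about, and compare against the previous run (the regexes here are fragile; a real HTML parser would be more robust).

```python
# pip install requests
import json
import re

import requests

# Placeholder URLs; in practice, one per important template.
URLS = ["https://www.example.com/", "https://www.example.com/category/"]

def seo_snapshot(html):
    def grab(pattern):
        match = re.search(pattern, html, re.I | re.S)
        return match.group(1).strip() if match else ""
    return {
        "title": grab(r"<title[^>]*>(.*?)</title>"),
        "canonical": grab(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)'),
        "robots": grab(r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)'),
    }

try:
    with open("snapshots.json") as f:
        previous = json.load(f)
except FileNotFoundError:
    previous = {}

current = {url: seo_snapshot(requests.get(url, timeout=10).text) for url in URLS}

for url, snapshot in current.items():
    if url in previous and previous[url] != snapshot:
        # Swap the print for an email/Slack alert in a real setup.
        print(f"SEO change on {url}: {previous[url]} -> {snapshot}")

with open("snapshots.json", "w") as f:
    json.dump(current, f, indent=2)
```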

Google Cloud Platform

My former coworker Julian wrote a great blog post on how to scale up Screaming Frog and run it on a Google Cloud server; it's one of many use cases for the Google Cloud Platform. Besides their servers, analyzing large data sets with BigQuery (with or without their Google Analytics connection) gives you a much better ability to handle large amounts of data (log files, internal databases, etc.).
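Once your logs are loaded into a table, querying them from Python is only a few lines with the google-cloud-bigquery client; the project/dataset/table and column names below are placeholders.

```python
# pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client()  # uses your default GCP credentials/project

# Placeholder table: parsed log lines with path and user_agent columns.
query = """
    SELECT REGEXP_EXTRACT(path, r'^/[^/]*') AS segment,
           COUNT(*) AS googlebot_hits
    FROM `my_project.logs.access_log`
    WHERE user_agent LIKE '%Googlebot%'
    GROUP BY segment
    ORDER BY googlebot_hits DESC
    LIMIT 20
"""

for row in client.query(query).result():
    print(row.segment, row.googlebot_hits)
```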

APIs

  • Data: In addition to the tools I just listed, there are a few APIs that I use on a regular basis that make my life easier by providing a lot of data: APIs to retrieve keyword volumes and related keywords, for example. To handle things at a bigger scale, you're going to want to work with APIs instead of dealing with Excel files.
  • Reporting: Most of the reports that you deliver can be automated, which is a great timesaver. The Google Analytics reporting add-on for Google Sheets, googleAnalyticsR, SearchConsoleR and the Google Analytics Reporting API V3 and V4 all help here; see the sketch below for the latter.
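As a rough idea of what automating a report against the Reporting API V4 looks like in Python (the view ID and key file are placeholders; googleAnalyticsR wraps the same endpoint in R):

```python
# pip install google-api-python-client oauth2client
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    "service-account.json", SCOPES
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

# Placeholder view ID; organic sessions per landing page, last 30 days.
response = analytics.reports().batchGet(
    body={
        "reportRequests": [{
            "viewId": "123456789",
            "dateRanges": [{"startDate": "30daysAgo", "endDate": "yesterday"}],
            "metrics": [{"expression": "ga:sessions"}],
            "dimensions": [{"name": "ga:landingPagePath"}],
            "filtersExpression": "ga:medium==organic",
        }]
    }
).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```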

What am I still looking for?

  • Quality Control & Assurance: Weekly crawls aren't enough; if things are messed up you want to know on an hourly basis, especially when things are moving so fast that you can't keep track of changes anymore.
  • More link data: Next to Majestic, it would be great to combine the datasets of other providers when doing link research. Doing this manually is doable, but not on a regular basis.
  • More keyword data: When you start your keyword research, you can just begin with a certain set of keywords, but it could be that you're forgetting a huge set of keywords in a related niche. I'm exploring how to get more keywords to start keyword research with (we're not talking about 19 extra keywords here, more like 190,000).

I'm sure this set of tools will keep evolving over the next months as new things happen. I'd love to learn more about the tools that you're using, so let me know in the comments or on Twitter what I should be using and I'll take a look!