Don't rely on Ahrefs traffic when outreaching for backlinks – consider indexation ratios instead


When running a backlink outreach campaign, people often use Ahrefs or SEMrush to gauge the likely level of traffic to a website and decide whether it's worth adding to their outreach shortlist.

A lot of people set minimum criteria such as:

DR 20+ (Domain Rating)

ST 500+ (estimated monthly search traffic)

etc.
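For what it's worth, that kind of screening is easy to script against a CSV export of prospects. This is a minimal sketch only: the column names ('Domain', 'DR', 'Organic traffic') and the thresholds are my assumptions – match them to whatever your export actually contains.

```python
import csv

# Hypothetical minimum criteria - adjust to taste.
MIN_DR = 20          # Domain Rating threshold
MIN_TRAFFIC = 500    # estimated monthly search traffic ("ST")

def shortlist(csv_path):
    """Return domains from a prospect CSV that meet the minimums.

    Assumes columns named 'Domain', 'DR' and 'Organic traffic';
    rename these to match your actual export.
    """
    keep = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                dr = float(row["DR"])
                traffic = float(row["Organic traffic"])
            except (KeyError, ValueError):
                continue  # skip rows with missing or malformed numbers
            if dr >= MIN_DR and traffic >= MIN_TRAFFIC:
                keep.append(row["Domain"])
    return keep

print(shortlist("ahrefs_export.csv"))
```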

My go-to choice is Ahrefs, and as great an SEO research tool as it is, its traffic metric (ST) is just plain wrong as an indicator of real traffic. In fact, it's important to remember that third-party metrics don't accurately correlate to Google's own data anyway.

I've got lots of examples of this, but here's one of my testing sites as shown in Ahrefs:

46 monthly visits, right?

But…

My site has had nearly 8,000 users in the past 30 days.

Remember – Ahrefs, SEMrush and the like are third-party tools using their own metrics. Take them with a pinch of salt.

Indexation Ratios

I've been looking for an alternative way to find websites I can trust when performing outreach. I did wonder whether I could check how many page 1 keywords a website had; in the case of my test site Ahrefs says "20", yet none of these correlate to the pages getting actual real traffic, not even close.

I'm still looking for a more accurate reading of website traffic, but in terms of finding sites suitable for guest posts etc. – in other words, websites that haven't fallen completely out of favour with Google – I started to consider a different way to determine whether Google still has love for a website. One way I've found is to work out its indexation ratio. This can be done by performing the advanced Google search site:domain.com and comparing the result against the number of posts/pages inside the XML sitemap.
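As a rough sketch of that comparison, the sitemap side can be scripted. Everything below is illustrative: the sitemap URL is a placeholder, it assumes a plain sitemap rather than a sitemap index, and the indexed count is typed in by hand from the site: search (scraping Google results programmatically isn't something I'd recommend).

```python
import urllib.request
import xml.etree.ElementTree as ET

# XML namespace used by <urlset>/<url>/<loc> sitemap entries.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(sitemap_url):
    """Count the <url> entries in a plain XML sitemap (not a sitemap index)."""
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    return len(tree.getroot().findall(NS + "url"))

# Placeholder sitemap URL; the indexed count comes from a manual
# site:domain.com check in Google.
sitemap_total = count_sitemap_urls("https://example.com/sitemap.xml")
indexed_count = 183  # read off the site: results by hand

print(f"Indexation ratio: {indexed_count / sitemap_total:.0%}")
```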

In my instance I have 177 posts and 6 pages, and nothing else – no tags, custom post types, categories or other archives – is indexed:

Google have stopped showing the number of indexed pages when you perform site:domain.com (Keywords Everywhere still shows a figure, but it can be pretty inaccurate so can't be relied upon). However, there are 18 pages of Google results for my site, which means everything is still indexed.
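Since Google no longer displays the total, you can approximate it from the number of result pages. Using my site's numbers as a back-of-envelope (results per page is an approximation; Google usually shows around ten, but the final page may have fewer):

```python
# Back-of-envelope estimate using this site's numbers.
result_pages = 18        # pages of site: results shown by Google
results_per_page = 10    # approximate; the last page may be shorter
sitemap_urls = 177 + 6   # posts + pages in the XML sitemap

estimated_indexed = result_pages * results_per_page  # ~180
ratio = estimated_indexed / sitemap_urls             # ~0.98

print(f"~{estimated_indexed} of {sitemap_urls} sitemap URLs indexed ({ratio:.0%})")
```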

GSC shows only 12 pages as "crawled – currently not indexed", and these are category pages which, as mentioned, I don't usually index anyway.

So, this means Google is still indexing all of the posts and pages.

If, as the third-party tools suggest, my site isn't hitting those minimum metrics and is therefore worthless and not to be trusted for, say, guest posts, then surely the "crawled – currently not indexed" count would be a lot higher, and so would the gap between the posts/pages in the sitemap and what has actually been indexed.

So, I'm edging toward this being a more reliable method of ascertaining how much love Google still has for a website when weighing up my guest post opportunities: indexation ratios.
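If you wanted to fold this into a vetting script, it could be as simple as the heuristic below; the 0.8 threshold is purely my assumption as a starting point, not a tested cut-off.

```python
def google_still_loves(indexed_count, sitemap_total, threshold=0.8):
    """Crude heuristic: treat a site as outreach-worthy if most of its
    sitemap URLs are indexed. The threshold is an assumed starting point."""
    return sitemap_total > 0 and (indexed_count / sitemap_total) >= threshold

print(google_still_loves(180, 183))  # True - roughly 98% indexed
```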

I do realise that there are much bigger websites than this and Google will only show so many indexed results, but in terms of simply figuring out whether Google has penalised a website altogether versus deciding not to show certain pages in its index, it's not a bad start.

Of course, my passion is for technical SEO and I just had to put this to the test.

Am I right?

Well, my results are usually good anyway, but by no longer dismissing sites that previously wouldn't get a second glance I've been able to open up the scope of possibilities. The end result has been overall keyword visibility growth across the board, with KPIs to match.