Recently I have been running some comparative industry data reports for a client, looking at individual site metrics such as overall organic search visibility and link data, including the number of quality links, defined here as links from domains with a Majestic TrustFlow of 20+.
I thought it would be interesting to see how closely the different sets of link data correlated with overall website search visibility.
On the graph above I have plotted the link data for 34 websites all from within a single industry.
Two separate datasets have been plotted on the chart.
Orange – The total number of linking root domains (the total number of individual websites that are linking to the subject site)
Blue – The total number of TrustFlow 20+ linking domains (the total number of ‘quality’ links to the subject site)
As the trendlines show, there appears to be a much closer overall correlation between rankings and the number of higher-quality linking domains than between rankings and the total number of links.
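To make the comparison concrete, here is a minimal sketch of how this kind of check can be run, using Spearman rank correlation (which suits ranked visibility data better than a raw linear fit). The figures below are invented for illustration only; they are not the 34-site dataset from the chart.

```python
# Hypothetical sketch: comparing how two link metrics correlate with a
# search-visibility score, using Spearman rank correlation (stdlib only).
# All numbers are illustrative, not the real industry dataset.

def ranks(values):
    """Assign 1-based ranks to values (assumes no ties, for simplicity)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative data for eight imaginary sites: a visibility score,
# total linking root domains, and TrustFlow 20+ linking domains.
visibility    = [95, 80, 72, 60, 45, 30, 22, 10]
total_domains = [5000, 900, 4200, 700, 2600, 450, 300, 950]
tf20_domains  = [800, 610, 540, 260, 400, 150, 90, 40]

print("total domains vs visibility:", round(spearman(total_domains, visibility), 2))  # 0.57
print("TF20+ domains vs visibility:", round(spearman(tf20_domains, visibility), 2))   # 0.98
```

With this made-up sample the quality-link metric tracks visibility far more tightly than the raw domain count, which is the same shape of result the trendlines above suggest.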
The usual caveat applies: correlation ≠ causation.
Some thoughts and takeaways
The correlation is much fuzzier towards the bottom of the axis, as smaller absolute numbers of links and rankings mean that the data can be more significantly skewed by outliers.
Looking at the absolute number of links as a competitor metric is a flawed approach to competition modelling. It is far more worthwhile to look at the number of higher-quality links a site has in order to get a more realistic picture.
Other factors should also be considered, such as link relevance, page position, on-page factors, site size and level of optimisation etc., all of which will have an impact on the overall search visibility of a site.
Google’s change in focus from penalising to discounting poor-quality links highlights its confidence in being able to identify and dial out the lower end of the link graph.
Links will continue to be a core part of Google’s algorithm. I’m a firm believer that other metrics are being, and will increasingly be, used to rank sites, such as user engagement metrics and eventually product and company review data. However, link data is vital to the core algorithm: good-quality links are difficult to manipulate at scale and are a good indication of multiple nuanced levels of relevance and quality.
Ultimately, focusing on any single factor is going to be a distraction. Any credible site needs to focus on developing a multidimensional marketing strategy. Everything from user experience, content strategy, PR and link development to paid search and advertising needs to work in synergy in order to maximise the overall marketing return.