Make strategic decisions about hard-to-get backlinks

Your website’s user experience is top-notch, your site architecture is solid, and you’ve created amazing content – now is the time to start link building.

Link building is the sub-discipline of search engine optimization (SEO) that is sometimes criticized because of the practices of spammers. In reality, however, it is a very important (and effective) part of internet marketing. Not only do quality backlinks drive referral traffic, they also remain one of the most important factors Google uses to rank websites organically in its search results.

But where to start? Which sites do you want links from to increase your rankings?

The following methodology will help you evaluate which sites you should get links from and strategically choose leads based on acquisition difficulty and link authority.

First step: start with a broad search

We’ll start by creating a list of relevant, high-quality sites, which we’ll then evaluate as possible link targets. Here’s how:

First, search Google for a very broad keyword related to your website. For example, a car manufacturer might search [cars]; if you are a recipe site, search [recipes]. Since these pages are ranked for very broad terms, they tend to be higher quality and authoritative sources of links.

Configure Google to show you 50-100 search results per page so you can create a nice, long list. To do this, navigate to the cog in the upper right corner of the search results page and select “Search settings”.

configure google search to show more results

On the Search Settings page, navigate to the “Results per page” section and move the slider to the right. (Note: for this option to be available, you may need to set “Google Instant predictions” to “Never show Instant results”.)

set Google SERP to 50 or 100 search results

Once you’ve adjusted your settings, hit the blue “Save” button at the bottom and return to the search results page (SERP).

You should now have a page with 50-100 results, depending on where you set the slider.

I recommend downloading a list of these pages to better analyze them. If you don’t already have the MozBar installed, add it to your browser. Once installed, click the toolbar icon to activate it on the search results page (you may need to refresh your tab). Click the icon next to the Moz logo to “Export SERP analysis to CSV”. The only column you will need is the URL column.

Use MozBar to export SERP results

Once you have your SERP URL list, run Screaming Frog SEO Spider, which will retrieve information about outbound links. (The free version should be sufficient for our purposes.) From the top menu, select “Mode->List”, click the “Upload” button, and choose “Enter Manually…”

Open your MozBar-generated CSV file in Excel, copy the URL column and paste the values into the Screaming Frog field. Click “OK” and let Screaming Frog crawl all of these links.

Once the crawl is complete, go to the “Bulk Export” menu, choose “All Outlinks” and save it to your computer.

Open it in Excel and filter the “Type” column to show only HREF links. Then remove any rows with “false” in the “Follow” column. This ensures that nofollow (non-equity-passing) links, CSS, images and JavaScript are excluded from our analysis.
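If you would rather script this filtering step than do it by hand in Excel, here is a rough pandas sketch. This is not part of the original workflow, and the column names (“Type”, “Follow”) and the “HREF” type value are assumptions about the Screaming Frog export that may differ between versions:

```python
import pandas as pd

# Toy stand-in for the Screaming Frog "All Outlinks" export; in practice
# you would load it with pd.read_csv("all_outlinks.csv"). Column names
# are assumptions and may vary by Screaming Frog version.
links = pd.DataFrame({
    "Type":        ["HREF", "CSS", "HREF", "Image"],
    "Source":      ["http://a.com/", "http://a.com/", "http://a.com/", "http://b.com/"],
    "Destination": ["http://b.com/", "http://a.com/s.css", "http://c.com/", "http://b.com/i.png"],
    "Follow":      ["true", "true", "false", "true"],
})

# Keep only HREF links (drops CSS, image and JavaScript requests), and
# keep only followed links so nofollow rows are excluded.
filtered = links[(links["Type"] == "HREF") &
                 (links["Follow"].str.lower() == "true")]

print(len(filtered))  # only the followed a.com -> b.com link survives
```

The same two filters you would apply manually in Excel become two boolean masks here, which makes the step repeatable when you rerun the crawl.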

Step Two: Google Spreadsheets Wizardry

Head to Google Drive and open the Screaming Frog export in Google Spreadsheets. (You can also use the SEO Tools for Excel add-on.)

Create a column called “Domain Source” and use a formula along these lines (replace B2 with the first value in the “Source” column):

=REGEXEXTRACT(B2, "^https?://[^/]+")

This formula will distill URLs down to the subdomain level.

Now create another column called “Domain Destination” with the equivalent formula (replace C2 with the first value from the “Destination” column):

=REGEXEXTRACT(C2, "^https?://[^/]+")

Drag the formula down the column to apply it to every row.

Next, create a column called “Same Domain” with a formula such as:

=IF(I2=D2, 1, 0)

Here, I2 should be the first value of “Domain Source” and D2 the first value of “Domain Destination”. As before, drag the formula down. It will return “1” for an internal link and “0” for an external link.

Now create a pivot table. To do this in Google Spreadsheets, select all relevant cells and choose “Data->Pivot Table”. For rows, add a field for “Domain Source”. For Values, add two fields: one for “Same domain” summarized by SUM and one for “Same domain” summarized by COUNT.

configure your pivot table in Google Spreadsheets

Next, copy the values from the pivot table to a new sheet using the “Paste as Values” function and delete the “Grand Total” row at the end. Rename “SUM of Same Domain” to “Internal Links” and rename “COUNT of Same Domain” to “Total Links”.

Create a new column called “External Percentage” (the percentage of a domain’s links that point to external sites): subtract the internal links from the total links to get the number of external links, then divide by the total number of links. Assuming “Internal Links” is in column B and “Total Links” in column C, the formula will look like this:

=(C2-B2)/C2
what the pivot table will look like

We will use this metric to gauge how easy or difficult it is to get a link from each of these domains. If a domain rarely links out to external sites, it’s less likely to link to yours.
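The whole spreadsheet stage above — distilling URLs to the domain, flagging internal links, pivoting, and computing the external percentage — can also be sketched in Python with pandas for anyone who prefers scripting. The toy data and column names below are illustrative, not from the original workflow:

```python
from urllib.parse import urlparse
import pandas as pd

# Toy filtered outlink data; in practice this would be the HREF-only,
# followed-links export from the previous step.
links = pd.DataFrame({
    "Source":      ["http://a.com/p1", "http://a.com/p2", "http://a.com/p3",
                    "http://b.com/x",  "http://b.com/y"],
    "Destination": ["http://a.com/p2", "http://c.com/",   "http://d.com/",
                    "http://b.com/y",  "http://b.com/z"],
})

# Distill each URL to its hostname (subdomain level), like the sheet step.
links["Domain Source"] = links["Source"].map(lambda u: urlparse(u).netloc)
links["Domain Destination"] = links["Destination"].map(lambda u: urlparse(u).netloc)

# 1 for an internal link, 0 for an external link.
links["Same Domain"] = (links["Domain Source"] == links["Domain Destination"]).astype(int)

# Pivot: SUM gives internal links, COUNT gives total links per source domain.
grouped = links.groupby("Domain Source")["Same Domain"]
pivot = pd.DataFrame({
    "Internal Links": grouped.sum(),
    "Total Links": grouped.count(),
}).reset_index()

# External Percentage = (Total Links - Internal Links) / Total Links
pivot["External Percentage"] = (
    (pivot["Total Links"] - pivot["Internal Links"]) / pivot["Total Links"]
)

print(pivot)
```

The groupby replaces the pivot table, with `sum` and `count` playing the roles of the SUM and COUNT value fields.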

Now find the Domain Authority of these domains. You can use the SEER SEO Toolkit in Google Spreadsheets with an API key, or run the domains through a bulk Domain Authority checker and add the values afterward. Domain Authority is the metric we will use as a proxy for link authority.

Step Three: Visualize Our Link Prospects to Help Make Strategic Decisions

At this point, we can gauge the difficulty of a link using our “External Percentage” column and the backlink authority using our “Domain Authority” column. Let’s visualize this so we can make quick decisions!

Create the R Visualization

I prefer to create this visualization in R for aesthetic reasons, but you can also use Excel or Google Spreadsheets.

  1. Download the spreadsheet in CSV format from Google Drive by selecting “File->Download as->Comma Separated Values (.csv, current sheet)”.
  2. Download R and start the program.
  3. Install ggplot2 by running install.packages("ggplot2"), then the grid package: install.packages("grid").
  4. Finally, we can generate our graph with the following code:

library(ggplot2)
library(grid)

link <- read.csv("C:\\{Directory}\\all_outlinks.csv")

qplot(link$Domain.Authority, link$Percent.External,
      size = link$Domain.Authority,
      color = link$Domain,
      label = link$Domain) +
  geom_hline(yintercept = mean(link$Percent.External), color = "red") +
  geom_vline(xintercept = mean(link$Domain.Authority), color = "red") +
  geom_text(color = "black", size = 2)


  • You will need to specify the path to your .csv file. Windows users should escape directory separators with double backslashes (i.e., C:\\).
  • Column names may vary depending on how you named yours. Running str(link) will show the exact names to reference in R.

The code will display a graph like this (example for the keyword, “SEO”):

visualize link building insights in R

Note the four quadrants created by the red lines. The upper right quadrant represents our top link targets: these domains have the highest domain authority and also link out to external sites most often. In this example for the “SEO” keyword, the domains in that quadrant are very good targets to pursue, because their high domain authority makes their links valuable and their frequent linking to external domains means a link should be easier to acquire.
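If you would rather build this quadrant chart in Python instead of R, a rough matplotlib sketch might look like the following. The sample data is made up, and the column names mirror the spreadsheet from earlier steps:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen so the script runs headless
import matplotlib.pyplot as plt
import pandas as pd

# Made-up prospect data; in practice, load your exported sheet with
# pd.read_csv(). Column names are assumptions about your spreadsheet.
df = pd.DataFrame({
    "Domain": ["a.com", "b.com", "c.com", "d.com"],
    "Domain Authority": [85, 40, 70, 30],
    "Percent External": [0.7, 0.2, 0.1, 0.8],
})

fig, ax = plt.subplots()

# Bubble size scales with Domain Authority, as in the R version.
ax.scatter(df["Domain Authority"], df["Percent External"],
           s=df["Domain Authority"] * 3)
for _, row in df.iterrows():
    ax.annotate(row["Domain"],
                (row["Domain Authority"], row["Percent External"]),
                fontsize=8)

# Red mean lines split the plot into the four quadrants.
ax.axvline(df["Domain Authority"].mean(), color="red")
ax.axhline(df["Percent External"].mean(), color="red")
ax.set_xlabel("Domain Authority")
ax.set_ylabel("Percent External")

fig.savefig("link_prospects.png")
```

Domains that land above and to the right of both red mean lines are your upper-right-quadrant prospects.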

Step Four: Start Building Links

Link building is an important part of SEO, but knowing where to start can be daunting. The methodology I’ve outlined will help you make strategic decisions about where to find the authoritative links that you actually have a chance of getting and that make the most sense for your keywords.

The opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About the Author

Paul Shapiro is director of strategy and innovation for Catalyst in Boston. Paul loves getting his hands dirty with innovative SEO strategies. He also enjoys watching old horror movies, programming, collecting old artifacts, and writing about SEO on his blog, Search Wilderness.
