Optimizing internal links can help search engine optimization. The larger the website, the more internal-link optimization helps Google identify important pages.
Ecommerce websites have category pages, product pages, and duplicates of those pages. A good ecommerce practice is steering Googlebot toward the important pages: the profitable ones. For this, search optimizers typically rely on robots.txt, canonical tags, and nofollow attributes. But those methods are detours. Direct links are much better.
For example, if I wanted a dog food category page to rank better, I would value internal links from copy-rich pages related to dog food. But if the website is big, how can we identify the pages to link from?
Internal Linking Opportunities
Start with a web crawler. Sitebulb, Screaming Frog, and DeepCrawl are the three I’m most familiar with. For this article, I’ll use Screaming Frog due to its popularity.
Step 1. Identify the pages you want to point the internal links to. For example, say that Guitar Center, a retailer, wants its Nylon Strings category page to rank higher. (I have no connection to Guitar Center other than as a customer.) Perhaps nylon strings have good margins, and the company would benefit from more organic-search visibility. Optimizing the internal linking structure could be a powerful signal to Google. But we need to crawl the entire site to take inventory of possible pages to link from.
Open Screaming Frog. Under Configuration > Custom > Search, there’s a “Custom Search” window. Here you can ask Screaming Frog to highlight any pages that match the term “nylon strings.” Screaming Frog will look through the source code of every page. When it finds this exact phrase in the code, it will log the page under the Custom tab.
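To make the mechanics concrete, here is a minimal Python sketch of what a custom search like this does under the hood: given crawled pages (URL mapped to HTML source), it flags the pages whose source contains the target phrase. The URLs, snippets, and the `find_link_sources` helper are hypothetical illustrations, not Screaming Frog's actual implementation.

```python
def find_link_sources(pages, phrase):
    """Return URLs whose HTML source contains the phrase (case-insensitive)."""
    needle = phrase.lower()
    return [url for url, html in pages.items() if needle in html.lower()]

# Hypothetical crawl results; a real crawler would fetch each page's source.
pages = {
    "https://example.com/blog/classical-guitar-care":
        "<p>Wipe down your nylon strings after every session.</p>",
    "https://example.com/blog/drum-maintenance":
        "<p>Tighten the lugs evenly to keep the head in tune.</p>",
}

print(find_link_sources(pages, "nylon strings"))
```

Each URL this returns is a candidate page to link from, since its copy already mentions the target topic.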