On March 12, 2019, a Google Broad Core Algorithm Update caused an instant 30% loss in sessions from organic search for an ecommerce website. In response, the site reduced its internal link graph from millions of pages to a couple of thousand by removing noindexed filter and facet pages from the crawl path. A sharp recovery began with the September 24, 2019, Broad Core Algorithm Update. By the second week of November, the site had positive year-over-year sessions from organic search.
Did the removal of the noindex pages from the link graph cause the recovery?
I believe so, for two reasons: the recovery happened at all, and it occurred as part of a Broad Core Algorithm Update.
Why would shrinking the link graph prompt the recovery?
Fewer pages in the link graph resulted in more PageRank for each remaining page.
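The intuition can be sketched with a toy calculation. The graph below is entirely hypothetical (it is not the site's actual structure), and the function is a bare-bones power-iteration PageRank, not Google's implementation; it only illustrates how pruning low-value pages leaves more rank for each remaining page.

```python
def pagerank(graph, damping=0.85, iters=50):
    """Minimal power-iteration PageRank. graph: dict page -> list of outlinks."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outlinks in graph.items():
            if outlinks:
                share = damping * rank[p] / len(outlinks)
                for q in outlinks:
                    new[q] += share
            else:  # dangling page: spread its rank over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Hypothetical site: home links to 2 category pages and 6 facet pages;
# every page links back to home.
facets = [f"facet{i}" for i in range(6)]
bloated = {"home": ["cat1", "cat2"] + facets,
           "cat1": ["home"], "cat2": ["home"],
           **{f: ["home"] for f in facets}}

# The same site with the facet pages removed from the link graph.
pruned = {"home": ["cat1", "cat2"], "cat1": ["home"], "cat2": ["home"]}

before = pagerank(bloated)
after = pagerank(pruned)
print(f"cat1 PageRank, bloated graph: {before['cat1']:.3f}")
print(f"cat1 PageRank, pruned graph:  {after['cat1']:.3f}")
```

In the bloated graph, home's outbound rank is split eight ways, so each category page gets a small share; in the pruned graph it is split two ways, and the category pages' PageRank rises sharply.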
What does link graph mean in this context?
It’s roughly similar to the crawl path from the home page of the website. It includes the pages that can be reached through internal links, starting from the home page (or any other page that is part of the link graph). It’s not the same as the backlink profile, which is made up of external links pointing to the site.
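That definition — pages reachable by following internal links from the home page — is just a graph traversal, which a short sketch makes concrete. The site structure below is hypothetical.

```python
from collections import deque

def link_graph(internal_links, start="home"):
    """Pages reachable from `start` via internal links (breadth-first search)."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in internal_links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

site = {
    "home": ["category", "about"],
    "category": ["product1", "product2"],
    "product1": [], "product2": [],
    # This facet page links INTO the graph, but nothing in the
    # graph links to it, so it is not part of the link graph.
    "facet?color=red": ["category"],
}
print(sorted(link_graph(site)))
# → ['about', 'category', 'home', 'product1', 'product2']
```

Note that `facet?color=red` is excluded even though it links to a page in the graph: membership is defined by reachability from the home page, not by who links in — which is also why the link graph is distinct from the backlink profile.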
What happened to the noindex pages?
Other than being removed from the crawl path/link graph, not much. They still link to pages in the link graph, and to pages outside it, but they no longer receive links from any page in the link graph. The value of external links pointing to these orphaned clusters of pages is at least partly passed on to the pages in the link graph.
Why was the site affected by the March 12 update?
This likely occurred because of negative changes in the site’s backlink profile in the months preceding the update.
How was it established that all those noindex pages were in fact noindex?
Millions of pages were crawled with Perficient Digital’s proprietary web crawler.
Can reducing the link graph always reverse the effects of a Broad Core Algorithm Update?
Evaluating the bloat of a link graph is a good place to start. It’s often not an easy decision to lop off parts of the link graph. The removal of a page could theoretically impact the site’s relevance for a particular query or set of queries. In most scenarios, the solution has to be carefully implemented and the outcome is less predictable. In this case, it was an easy decision since the pages were already noindex and consequently not part of relevance considerations.
A site can lose traffic because of a Broad Core Algorithm Update for many reasons other than link graph bloat. Never assume that simply reducing the site’s link graph will reverse traffic loss caused by a Broad Core Algorithm Update.
Doesn’t Google explicitly say that traffic loss caused by a Broad Core Algorithm Update is not a sign that there is anything wrong with the affected website?
This particular traffic loss resulted in a revenue loss of about one million dollars per week, so simply saying, “There’s nothing to fix,” while doing nothing wasn’t an option. That said, a website can experience wild fluctuations from one Broad Core Algorithm Update to the next, without undertaking any noteworthy changes. But that is not the norm, in my experience.
Mats Tolander has 20 years of web development experience and he has been involved in search engine optimization for almost as long. Mats oversees Perficient Digital’s SEO programmer team, provides the senior marketing consultants and marketing associates with technical SEO support and training, and keeps up with changes in the search engine optimization landscape.