The mistake that cost SEO Hero 50 positions in Google


Just a couple of days ago, I made a serious error in the technical optimization of the SEO Hero site, specifically in robots.txt.

The error cost me more than 50 Google SERP positions in a single day.

I disallowed all of the site's technical pages in the robots.txt file.

For details about robots.txt, see https://seoheronews.com/set-robots-txt
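As a sketch, disallowing technical pages in robots.txt looks like this (the paths below are hypothetical examples, not SEO Hero's actual ones):

```
User-agent: *
# Hypothetical technical sections blocked from crawling
Disallow: /admin/
Disallow: /tmp/
```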

I also made further changes to the site's pages, specifically to the Heroes pages. Heroes is future functionality of the site, and the page is currently empty; only its architecture has been created.

I forbade indexing of the Heroes pages through the robots meta tag: noindex, follow, archive.

These are quite different directives:

  • `noindex`;
  • `follow`;
  • `archive`.
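In HTML, these directives go into a single robots meta tag in the page's `<head>`. A minimal sketch is below; note that the standard directive for controlling cached copies is `noarchive` — `archive` as written above is not a recognized value:

```html
<head>
  <!-- Tells crawlers: do not index this page, but do follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```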

And the Heroes section pages were also listed in sitemap.xml.

For details about the sitemap, see https://seoheronews.com/seo-hero-sitemap
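For illustration, a sitemap entry for such a page looks roughly like this (the domain and path are hypothetical, not the site's actual URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical Heroes page listed in the sitemap -->
  <url>
    <loc>https://example.com/heroes/</loc>
  </url>
</urlset>
```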

Surprisingly, Google indexed the pages.

Of course, this was exactly what I did not want, and I had taken care to prohibit indexing of all those pages with the robots meta tag.

Nevertheless, the pages ended up in Google's search index. A likely reason: when robots.txt blocks crawling, Googlebot never fetches the page and so never sees the noindex meta tag, and a URL that is blocked in robots.txt but listed in sitemap.xml can still be indexed from its URL alone.

Now, using the Java programming language, I am developing code that dynamically generates the robots.txt file, registering every page in robots.txt.
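A minimal sketch of such a generator in Java follows; the class name, method, and page paths are my own illustration, not the actual SEO Hero code:

```java
import java.util.List;

// Hypothetical sketch: builds a robots.txt body that disallows
// a given list of page paths for all crawlers.
public class RobotsTxtGenerator {

    public static String generate(List<String> disallowedPaths) {
        StringBuilder sb = new StringBuilder("User-agent: *\n");
        for (String path : disallowedPaths) {
            sb.append("Disallow: ").append(path).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Example: block hypothetical technical and Heroes pages
        System.out.print(generate(List.of("/heroes/", "/admin/")));
    }
}
```

In a real deployment, the path list would come from the site's page registry, and the result would be served at `/robots.txt` instead of printed.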

[Screenshot: SEO Hero SERP positions and site views]