Google reminds webmasters that it will stop supporting noindex in robots.txt on September 1

Google reminded webmasters that, starting September 1, it will stop supporting the noindex directive in robots.txt. The search engine first announced the change in July through notifications in Search Console.

Google Webmasters tweeted: “Just a reminder that September 1 is coming soon. The noindex directive in robots.txt will no longer be supported. Use the other options mentioned in our blog.”

As a replacement for noindex in robots.txt, Google advises using noindex in robots meta tags, 404 and 410 HTTP status codes, password protection, the URL removal tool in Search Console, or Disallow rules in robots.txt. The first two meta-tag and header options are illustrated in the sketch below.
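For illustration, here is a minimal sketch of two of those replacements: a noindex robots meta tag in the page's HTML and the equivalent X-Robots-Tag response header. It assumes a Python Flask app and a hypothetical /members page; the route, page content, and framework are examples, not anything prescribed by Google.

from flask import Flask, make_response

app = Flask(__name__)

# Hypothetical page that should be kept out of Google's index.
PAGE_HTML = """<!DOCTYPE html>
<html>
<head>
  <!-- Replacement 1: noindex in a robots meta tag -->
  <meta name="robots" content="noindex">
  <title>Members area</title>
</head>
<body>Members-only content</body>
</html>"""

@app.route("/members")
def members():
    resp = make_response(PAGE_HTML)
    # Replacement 2: the same signal sent as an HTTP response header,
    # which also works for non-HTML resources such as PDFs.
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

if __name__ == "__main__":
    app.run()

Note that for either signal to work, Googlebot must still be able to crawl the page, so the same URL should not also be blocked with a Disallow rule in robots.txt.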
