Digital Marketing

Google reminds webmasters that noindex support in robots.txt ends September 1

Google reminded webmasters that starting September 1 it would stop supporting the noindex directive in robots.txt. The search engine first announced this in July through notifications in Search Console.

Google Webmasters tweeted: “Just a reminder that September 1 is coming soon. The noindex directive in robots.txt will no longer be supported. Use the other options mentioned in our blog.”

As replacements for noindex in robots.txt, Google advised using noindex in robots meta tags, 404 and 410 HTTP status codes, password protection, the URL removal tool in Search Console, or Disallow in robots.txt.
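As a sketch of the first option Google recommends, the robots meta tag is placed in the `<head>` of any page that should not be indexed (the page itself is hypothetical here):

```html
<!-- Keeps this page out of Google's index; replaces noindex in robots.txt -->
<meta name="robots" content="noindex">
```

Note that for the meta tag to work, the page must remain crawlable: blocking it with Disallow in robots.txt would prevent Google from ever seeing the tag. Disallow is therefore a separate option for blocking crawling, not a drop-in equivalent to noindex.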

About the author

Jay Galaczi

Jay Galaczi is an SEO veteran, web designer, and web developer. He specializes in technical SEO and in user and customer experience.
