From September 1, Google will stop honoring unsupported and unpublished rules in robots.txt files. Although it was never officially documented by Google, adding noindex directives to a robots.txt file had worked for over ten years; that is no longer the case.
Announcing the change on the Google Webmaster Central blog, the company said: “In the interest of maintaining a healthy ecosystem and preparing for potential future open source releases, we’re retiring all code that handles unsupported and unpublished rules (such as noindex) on September 1, 2019.”
Google listed the following options, the ones you probably should have been using anyway:
(1) Noindex in robots meta tags: Supported both in the HTTP response headers and in HTML, the noindex directive is the most effective way to remove URLs from the index when crawling is allowed (a sketch of the header version, along with a 410 response, follows this list).
(2) 404 and 410 HTTP status codes: Both status codes mean that the page does not exist, which will drop such URLs from Google’s index once they’re crawled and processed.
(3) Password protection: Unless markup is used to indicate subscription or paywalled content, hiding a page behind a login will generally remove it from Google’s index.
(4) Disallow in robots.txt: Search engines can only index pages that they know about, so blocking a page from being crawled usually means its content won’t be indexed. A search engine may still index a URL based on links from other pages without seeing the content itself, but Google says it aims to make such pages less visible in the future.
(5) Search Console Remove URL tool: The tool is a quick and easy method to remove a URL temporarily from Google’s search results.
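To make options (1) and (2) concrete, here is a minimal sketch, not taken from Google’s post, of how a server might send the noindex directive via the X-Robots-Tag response header and answer removed URLs with a 410 Gone status. The path, page content and port are hypothetical placeholders.

    # Minimal sketch of two noindex alternatives: an X-Robots-Tag header
    # for pages that stay online but should not be indexed, and a 410
    # status for pages that have been removed. Path and port are made up.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/private-report":
                # Page stays reachable but is kept out of the index.
                body = b"<html><body>Internal report</body></html>"
                self.send_response(200)
                self.send_header("X-Robots-Tag", "noindex")
                self.send_header("Content-Type", "text/html")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                # Removed page: 410 tells crawlers it is permanently gone.
                self.send_response(410)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), Handler).serve_forever()

Requesting /private-report from this server with curl -i would show the X-Robots-Tag: noindex header in the response, while any other path returns 410.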
You can check how your robots.txt file is currently set up, and whether it still contains a noindex rule, with the robots.txt Tester in Google Search Console.
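If you prefer to automate that check, the rough sketch below fetches a publicly reachable robots.txt file and flags any lines that still rely on the soon-to-be-retired noindex rule. The example.com URL is a placeholder for your own site.

    # Rough sketch: flag noindex lines in a robots.txt file before the
    # September 1 cutoff. The URL below is a placeholder.
    import urllib.request

    def find_noindex_rules(robots_url):
        with urllib.request.urlopen(robots_url) as resp:
            text = resp.read().decode("utf-8", errors="replace")
        flagged = []
        for number, line in enumerate(text.splitlines(), start=1):
            rule = line.split("#", 1)[0].strip()  # ignore comments
            if rule.lower().startswith("noindex"):
                flagged.append((number, rule))
        return flagged

    for number, rule in find_noindex_rules("https://example.com/robots.txt"):
        print(f"line {number}: {rule}  <- no longer honored after September 1")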