Adding the meta noindex tag to the head of every page on a site will not always get your website deindexed from Google. Because Google's crawler has to revisit each page to see the tag and decide whether to honor it, it is important to know the other ways a site can be deindexed.
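For reference, the noindex directive discussed above is a single meta tag placed inside each page's head element:

```html
<!-- Tells compliant crawlers not to include this page in search results -->
<meta name="robots" content="noindex">
```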
1. Use Robots.txt
You can use your website's robots.txt file to block Google's bots from crawling the site. Add the following code to the site's robots.txt to block Google's bots:
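The snippet itself is missing from this copy of the article; the standard robots.txt rule for blocking Googlebot from an entire site looks like this:

```text
User-agent: Googlebot
Disallow: /
```

One caveat worth noting: if Googlebot is blocked from crawling a page, it can no longer see any noindex tag on that page, which is why the article treats robots.txt and the URL removal tool as a combined approach rather than relying on noindex alone.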
2. Verify Search Console
Next, you need to verify your website in Google Search Console to be able to use the 'Remove URLs' tool.
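One common verification method (among several that Search Console offers) is adding a site-verification meta tag to your homepage; the token below is a hypothetical placeholder, as the real value is generated by Search Console for your account:

```html
<!-- Placed in the <head> of the homepage; YOUR_TOKEN_HERE is a placeholder -->
<meta name="google-site-verification" content="YOUR_TOKEN_HERE">
```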
3. URL Removal Tool in Google
Then go to Google Index > Remove URLs and request removal of individual pages, whole folders, or your entire subdomain. Note that you can only remove pages on a subdomain you have verified in Google Search Console.
4. Verify That Pages Are Deindexed
The final step is to check whether the pages you wanted removed from Google's search results are actually gone. You can use Google's advanced search operators, such as "site:dev.mydomain.com", to see whether your website has been removed. To check whether a specific web page has been removed, use "info:mydomain.com/specific-page". If you do not see any search results, you have successfully deindexed your website from Google. Read more at business2community.com
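While waiting for Google to process the removal, you can at least sanity-check that your robots.txt rules say what you intend. A minimal sketch using Python's standard-library robots.txt parser, assuming the Googlebot-blocking rule from step 1 and the dev.mydomain.com subdomain used as the example above:

```python
from urllib.robotparser import RobotFileParser

# The same rules you would serve at https://dev.mydomain.com/robots.txt
# (dev.mydomain.com is the example subdomain from this article).
rules = [
    "User-agent: Googlebot",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Googlebot is blocked from every path on the site...
print(rp.can_fetch("Googlebot", "https://dev.mydomain.com/specific-page"))   # False

# ...while agents not named in the file are unaffected.
print(rp.can_fetch("SomeOtherBot", "https://dev.mydomain.com/specific-page"))  # True
```

This only confirms that the rules parse the way you expect; whether the pages have actually dropped out of the index still has to be checked with the site: operator as described above.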