What is the noindex directive in SEO?

Definition

Noindex is a directive that instructs search engine robots not to include a specific page in their index. Applied via a meta robots tag or HTTP header, it is the primary technical lever for controlling which pages appear in search results.

The noindex directive is one of the most direct technical SEO controls available. It tells Googlebot: “crawl this page if you wish, but do not add it to the index and do not show it in search results.” Unlike robots.txt, which blocks crawling, noindex allows the crawler to access the page while preventing it from appearing in SERPs.

The two implementation methods

The most common method is the meta robots tag in the HTML head: <meta name="robots" content="noindex">. The alternative is an X-Robots-Tag HTTP header, which works for any URL including non-HTML resources. Both are equally valid; the meta tag is simpler for most CMS implementations.
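Both methods can be checked programmatically. The sketch below, assuming a hypothetical helper name `is_noindexed`, detects either implementation given a page's HTML and its response headers; a production crawler would use a proper HTML parser rather than a regex, which is a simplification here.

```python
import re

def is_noindexed(html: str, headers: dict) -> bool:
    """Return True if either noindex implementation method is present.

    Checks the X-Robots-Tag HTTP header first, then falls back to
    scanning the HTML for a meta robots tag. The regex assumes the
    name attribute comes before content, a simplification of real
    markup.
    """
    # Method 2: X-Robots-Tag HTTP header (works for non-HTML resources too)
    xrt = headers.get("X-Robots-Tag", "")
    if "noindex" in xrt.lower():
        return True
    # Method 1: <meta name="robots" content="noindex"> in the HTML head
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    return bool(meta and "noindex" in meta.group(1).lower())
```

Note that "index, follow" does not trigger a match: the check looks for the full token "noindex", not the substring "index".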

When to use noindex

Noindex is the right tool for pages with no organic search value: login pages, admin panels, internal search results, thank-you pages, duplicate content variants, and low-value paginated pages. Keeping these pages out of the index improves overall site quality signals and focuses crawl budget on pages that matter.
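In practice, sites often encode these page types as URL patterns and apply noindex centrally. A minimal sketch, assuming hypothetical path patterns and a helper name `should_noindex` (pagination and duplicate variants usually need their own query-string logic, omitted here):

```python
import fnmatch

# Hypothetical patterns for page types with no organic search value,
# mirroring the categories listed above.
NOINDEX_PATTERNS = [
    "/login*",
    "/admin/*",
    "/search*",      # internal site-search results
    "/thank-you*",
]

def should_noindex(path: str) -> bool:
    """Decide whether a URL path should receive a noindex directive."""
    return any(fnmatch.fnmatch(path, pattern) for pattern in NOINDEX_PATTERNS)
```

A template or middleware layer can then emit the meta robots tag (or the X-Robots-Tag header) only when this check returns True, keeping the rule set in one place.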

Common mistakes with noindex

The most damaging error is applying noindex to a page that is also blocked by robots.txt. If Googlebot cannot crawl the page, it cannot read the noindex directive, so the page may remain indexed indefinitely. Another frequent mistake is leaving noindex on pages after a site launch — a development-phase directive forgotten in production.
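The robots.txt conflict can be caught before it does damage. Using Python's standard urllib.robotparser, a pre-flight check might look like this (the robots.txt content is inlined for illustration; `noindex_is_readable` is a hypothetical helper name):

```python
from urllib.robotparser import RobotFileParser

def noindex_is_readable(robots_txt: str, url_path: str) -> bool:
    """Return True if crawlers may fetch the page, meaning a noindex
    directive on it can actually be seen and processed.

    If this returns False, the page is blocked by robots.txt, so any
    noindex on it is invisible to Googlebot and the URL may stay
    indexed indefinitely.
    """
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url_path)

robots = "User-agent: *\nDisallow: /private/"
print(noindex_is_readable(robots, "/private/page"))  # blocked: noindex unreadable
print(noindex_is_readable(robots, "/public/page"))   # crawlable: noindex can work
```

Running a check like this against every noindexed URL during deployment catches both mistakes described above: the robots.txt conflict directly, and forgotten development-phase directives as a side effect of auditing the noindex list at all.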

Does noindex take effect immediately?

Not immediately. Google must recrawl the page to process the noindex directive, which can take days to weeks. The URL Removal tool in Google Search Console can temporarily suppress a URL while the noindex directive takes permanent effect.

Does a noindexed page still pass link equity?

Yes, provided it is crawlable. A noindexed page that Google can access will still have its outgoing links followed and link equity passed to their destinations, although Google has indicated that pages left noindexed for a long time may eventually be treated as nofollow as well. Noindex only controls SERP visibility, not link flow, which is an important distinction from robots.txt blocking.