Is Google indexing unnecessary search pages, 404 errors, or admin pages on your WordPress site? This can harm SEO, waste crawl budget, and pose security risks. This guide will show you how to block unwanted pages using PHP and robots.txt to improve rankings and performance.

Why Block Unnecessary Pages from Google Indexing?
- Improve SEO – Prevent duplicate content and help important pages rank higher.
- Optimize crawl budget – Ensure Google focuses on valuable content.
- Enhance security – Avoid exposing sensitive pages (e.g., admin pages).
How to Block Unnecessary Pages Using PHP Code
Step 1: Add the Code to Your Theme’s functions.php File
First, open the functions.php file of your active WordPress theme and add the following code at the end. The snippet below targets Rank Math; the same approach works with other SEO plugins, but the hook name (and the format of the value it filters) varies by plugin, as covered in Step 2.
```php
add_filter( 'rank_math/frontend/robots', function( $robots ) {
	// Build the full URL of the current request.
	$url = home_url( $_SERVER['REQUEST_URI'] );

	// Flag internal search results and 404 pages so they are not indexed.
	if ( strpos( $url, '/search/' ) !== false
		|| strpos( $url, '?s=' ) !== false
		|| strpos( $url, '404.php' ) !== false
		|| is_404()
	) {
		$robots['index']  = 'noindex';
		$robots['follow'] = 'nofollow';
	}

	return $robots;
} );
```
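WordPress serves internal search results at URLs containing ?s= (and, on many setups, /search/), which is why the snippet checks both patterns; the is_404() conditional tag catches error pages regardless of their URL, so the extra 404.php substring check is only a belt-and-braces fallback.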
Step 2: Customize the Code for Your SEO Plugin
Replace the hook in the code above with the correct one for your SEO plugin:
- Rank Math: rank_math/frontend/robots
- Yoast SEO: wpseo_robots
- All in One SEO (AIOSEO): aioseo_robots_meta
Note that these filters do not all pass the same kind of value: Rank Math works with an array (as in the snippet above), while Yoast’s wpseo_robots filters the finished robots meta string, so the callback body must change along with the hook; see the Yoast sketch after this list.
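For example, here is a minimal sketch of a Yoast SEO version covering the same pages. It relies on Yoast’s wpseo_robots filter, which passes the robots meta value as a single string rather than an array:

```php
add_filter( 'wpseo_robots', function( $robots ) {
	// Yoast filters the robots meta value as a string, e.g. "index, follow".
	if ( is_search() || is_404() ) {
		return 'noindex, nofollow';
	}
	return $robots;
} );
```

AIOSEO’s aioseo_robots_meta filter passes an array of attributes instead; check your plugin’s developer documentation for the exact format it expects.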
Step 3: Save and Test
After adding the code, save the functions.php file and test it by visiting the blocked pages (e.g., search results or 404 pages) and viewing the page source. Then use the URL Inspection tool in Google Search Console to verify that these pages drop out of the index.
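If the snippet is working, the head of a blocked page should contain a robots meta tag along these lines (the exact directives and attribute order depend on your SEO plugin):

```html
<meta name="robots" content="noindex, nofollow" />
```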

How to Block Pages Using robots.txt
In addition to PHP code, you can use the robots.txt file in your site’s root directory to block Google’s bots from crawling unnecessary pages, including linked video pages, by using the Disallow directive. Here’s an example:
```
User-agent: *
Disallow: /wp-admin/
Disallow: /?s
Disallow: /?p
Disallow: /wp-content/video/xx
```
This block prevents Google from crawling:
- Specific directories like /wp-content/video/xx.
- Admin pages (/wp-admin/).
- Search result pages (URLs beginning with /?s).
- URLs whose query string begins with p, which covers plain permalinks such as /?p=123 as well as paginated queries such as /?paged=2.
While the PHP snippet prevents indexing, robots.txt stops Google from crawling certain pages. The two are not interchangeable, and combining them on the same URL can backfire: if a page is disallowed in robots.txt, Googlebot never fetches it and so never sees the noindex meta tag, meaning the URL can still be indexed if other pages link to it. Use noindex for pages that must stay out of the index, and robots.txt for pages that should not be crawled at all.
Important Notes
- Test before deploying: Always test changes on a staging site before applying them to your live site.
- Avoid over-blocking: Only block pages that are truly unnecessary. Blocking important pages can harm your SEO.
- Monitor results: Use Google Search Console to monitor changes and ensure the blocked pages are no longer indexed.
SEO Best Practices:
- Audit regularly – Use tools like Screaming Frog or Ahrefs.
- Optimize sitemaps – Include only important pages (a robots.txt pointer to your sitemap is sketched after this list).
- Improve speed – Use caching plugins like WP Rocket.
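On the sitemap point, robots.txt can also advertise your XML sitemap to crawlers via the Sitemap directive. A minimal sketch, assuming a hypothetical sitemap index at /sitemap_index.xml (the actual path depends on your SEO plugin, so substitute your own sitemap URL):

```
# Hypothetical sitemap location; replace with your site's actual sitemap URL.
Sitemap: https://example.com/sitemap_index.xml
```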
Blocking unnecessary pages from being indexed is a powerful SEO technique that can significantly improve your WordPress site’s performance. By using PHP code and robots.txt, you can take control of what Google sees, optimize your crawl budget, and enhance your site’s security.
As an experienced SEO professional, I highly recommend implementing these strategies to ensure your site ranks higher and performs better. If you have any questions or need further assistance, feel free to reach out!