Robots.txt for WordPress – Best Settings, Examples & Complete Guide (2025)

If you run a WordPress website, you’ve probably heard about “robots.txt.” Many people ignore this file, but it is an important part of technical SEO. It tells search engine crawlers which pages they may crawl and which ones to skip. If robots.txt is misconfigured, Google may not crawl your site correctly, which can hurt rankings.


In this blog, we’ll understand what robots.txt is, how to create it in WordPress, the best recommended settings, and common mistakes to avoid.

What Is Robots.txt?

Robots.txt is a simple text file located in your website’s root folder. It gives instructions to search engine bots like Googlebot and Bingbot.

It tells bots:

  • Which pages to crawl
  • Which pages to avoid
  • Which sections to restrict
  • Where the sitemap is located

In simple words:

Robots.txt is your website’s traffic controller for search engines.

Why Robots.txt Is Important for WordPress

If you care about SEO, ignoring robots.txt can be a mistake. This file protects your website and helps Google crawl it correctly.

Benefits:

  • Improved crawling
  • Prevents duplicate pages from being crawled
  • Keeps crawlers out of private areas like wp-admin (note: this is a crawling hint, not a security measure)
  • Reduces server load
  • Maintains stable SEO performance

How to Check Robots.txt in WordPress

Just add /robots.txt at the end of your site URL:

https://yourwebsite.com/robots.txt

If WordPress doesn’t generate it automatically, you can create one manually.

How to Create Robots.txt in WordPress

There are 3 simple ways:

  1. Yoast SEO Plugin
  • Dashboard → SEO → Tools → File Editor
  • Create or edit robots.txt
  2. Rank Math Plugin
  • Rank Math → General Settings → Edit robots.txt
  3. Manually Using File Manager
  • Go to cPanel → File Manager
  • Open public_html
  • Create a robots.txt file
  • Add your content and save
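If you have shell or SSH access instead of cPanel, the same file can be written with a short Python sketch (the `public_html` path is an assumption; use your actual document root):

```python
# Minimal sketch: write a standard WordPress robots.txt into the site root.
from pathlib import Path

ROBOTS_TXT = """User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourwebsite.com/sitemap.xml
"""

# "public_html" is a placeholder for your web server's document root.
docroot = Path("public_html")
docroot.mkdir(exist_ok=True)
(docroot / "robots.txt").write_text(ROBOTS_TXT, encoding="utf-8")

print((docroot / "robots.txt").read_text(encoding="utf-8"))
```

Remember that the file must live at the root of the domain (public_html/robots.txt), not inside a subfolder.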

Recommended Robots.txt for WordPress

Below is a Google-friendly robots.txt that works for most WordPress websites:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://yourwebsite.com/sitemap.xml

Explanation:

  • User-agent: * → Rules apply to all bots
  • Disallow: /wp-admin/ → Prevents admin area from being crawled
  • Allow: admin-ajax.php → Re-allows admin-ajax.php, which many themes and plugins use for front-end AJAX requests
  • Sitemap → Helps Google quickly find all pages
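You can sanity-check these rules locally with Python’s standard-library parser. One caveat: `urllib.robotparser` applies the first matching rule, while Googlebot applies the most specific (longest) rule, so the Allow line is listed first here to keep both interpretations in agreement. The domain is a placeholder.

```python
# Check the recommended rules with Python's built-in robots.txt parser.
from urllib import robotparser

rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

site = "https://yourwebsite.com"
print(rp.can_fetch("*", site + "/wp-admin/"))                # False (blocked)
print(rp.can_fetch("*", site + "/wp-admin/admin-ajax.php"))  # True  (allowed)
print(rp.can_fetch("*", site + "/blog/some-post/"))          # True  (allowed)
```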

Common Robots.txt Mistakes to Avoid

  1. Blocking the Entire Site

Many people accidentally add:

User-agent: *
Disallow: /

This completely stops Google from crawling your whole website. Result: zero rankings.
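A quick way to see the damage: Python’s standard-library `urllib.robotparser` confirms that this rule blocks every URL on the site (the domain is a placeholder):

```python
from urllib import robotparser

# The accidental "block everything" rule
bad_rules = ["User-agent: *", "Disallow: /"]

rp = robotparser.RobotFileParser()
rp.parse(bad_rules)

# Every URL on the site is now off-limits to compliant crawlers
print(rp.can_fetch("Googlebot", "https://yourwebsite.com/"))           # False
print(rp.can_fetch("Googlebot", "https://yourwebsite.com/any/page/"))  # False
```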

  2. Blocking Important Folders

Beginners often block /wp-content/, which holds your images, CSS, and JavaScript. If Google can’t fetch these files, it can’t render your pages properly, which hurts indexing and rankings.

  3. Creating Multiple Robots.txt Files

There should only be ONE robots.txt file, located in the site root. Crawlers ignore robots.txt files in subdirectories, and conflicting copies cause confusion.

SEO Tips for Optimizing Robots.txt (2025)

  • Always add your sitemap
  • Don’t block unnecessary folders
  • Use robots.txt to improve crawl budget
  • Keep the file simple and clean
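As an example of the crawl-budget tip, many WordPress sites also keep internal search result URLs out of the crawl. This is an optional, commonly used addition; check which URL patterns your site actually produces before copying it:

User-agent: *
Disallow: /?s=
Disallow: /search/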

Impact of Robots.txt on SEO

Robots.txt is not a direct ranking factor, but it improves crawling, indexing speed, website performance, and reduces duplicate issues—indirectly improving rankings.

How to Test Robots.txt

Use the robots.txt report in Google Search Console: Settings → robots.txt. (Google retired the old standalone robots.txt Tester in 2023.)

Here you can:

  • See which robots.txt files Google found and when they were last crawled
  • Check for fetch problems and parsing errors
  • Request a recrawl after you update the file

Robots.txt vs .htaccess – What’s the Difference?

  • Robots.txt gives instructions to bots; it is advisory, and only well-behaved crawlers follow it
  • .htaccess configures the web server itself (redirects, access control, security), so its rules apply to every visitor

Both serve different purposes.

Conclusion

Robots.txt is a small but powerful file that plays an important role in technical SEO. If configured correctly, it helps search engines crawl your site better and improves website performance. Use the recommended file above for best results.
