robots.txt can directly affect whether important parts of a site are crawlable before and after launch, so it should always be reviewed in final technical checks.
A misplaced rule, a leftover staging block, or an incorrect directive can create crawl and visibility issues that are easy to miss before delivery. For the broader process, use this WordPress pre-launch checklist, and go back to the main tool page when you are ready to run a check.
robots.txt is a crawl control file that tells compliant bots which paths they should avoid. In WordPress projects, it often changes during setup, migration, or staging, which makes it a high-risk item right before publication.
If rules are wrong at launch time, critical pages may become harder to discover, while non-essential paths may still be exposed to crawlers. A quick review prevents avoidable indexing and QA issues.
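For orientation, the default virtual robots.txt that WordPress serves looks roughly like this (the sitemap line and domain are illustrative placeholders; your production file may differ):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/wp-sitemap.xml
```

Anything stricter than this at launch, such as a bare Disallow: /, usually deserves a second look.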
Confirm robots.txt is available and returns a normal HTTP 200 response. A missing file or unstable responses can make crawler behavior harder to predict.
Verify that important public sections are not disallowed by accident. An overly broad rule can block entire areas that should stay crawlable.
Check for temporary staging directives that were never removed. Migration and launch windows often leave these behind.
Review that the homepage and key public content are still crawlable as intended for production.
Ensure robots.txt directives match the intended indexing state and do not conflict with launch goals.
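The checks above can be scripted with the standard library. This is a minimal sketch, not part of any specific tool: the robots.txt content, site URL, and list of must-allow paths are illustrative assumptions you would replace with your own.

```python
from urllib import robotparser

# Illustrative robots.txt content, as fetched from the production site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

# Paths that must stay crawlable after launch (illustrative).
MUST_ALLOW = ["/", "/blog/", "/about/"]


def check_crawlable(robots_txt, paths, site="https://example.com"):
    """Return a path -> allowed mapping for the generic '*' user agent."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {path: rp.can_fetch("*", site + path) for path in paths}


results = check_crawlable(ROBOTS_TXT, MUST_ALLOW + ["/wp-admin/"])
# Public paths should come back allowed; /wp-admin/ should come back blocked.
```

In a real pre-launch check you would fetch the file over HTTP first, confirm the response status, and then run this kind of path verification against the fetched content.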
Frequent mistakes include keeping staging blocks in production, disallowing paths too broadly, or forgetting to update rules after domain changes. Another common issue is assuming robots.txt alone is enough to manage index visibility without validating the final setup.
These issues are usually small in isolation, but they can cause delayed discovery, inconsistent crawl behavior, or unexpected visibility problems after launch.
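The leftover staging block is the easiest of these mistakes to catch mechanically. As a quick illustrative sketch (the sample file contents are assumptions, not real sites), a few lines of Python can flag the classic bare Disallow: / rule:

```python
def has_full_block(robots_txt):
    """Flag a bare 'Disallow: /', the typical leftover staging rule."""
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "disallow" and value.strip() == "/":
            return True
    return False


# A staging file often blocks everything; production should not.
STAGING = "User-agent: *\nDisallow: /\n"
PRODUCTION = "User-agent: *\nDisallow: /wp-admin/\n"
```

A check like this is deliberately conservative: it only flags the unambiguous full block, and leaves broader-but-intentional rules for human review.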
PreFlight helps review launch-critical technical signals before delivery, including checks that affect crawlability and indexing readiness in WordPress projects.
It supports practical pre-launch verification, but it does not replace a full specialized audit when deeper analysis is required.
Run a technical check before launch so crawl-related issues are caught early and fixed before they affect production visibility.
Does disallowing a page in robots.txt keep it out of the index?
Not always. robots.txt mainly controls crawl access for compliant bots, but whether a URL stays indexed can also depend on other signals and on how the page was previously discovered.
Should staging and production use different robots.txt rules?
Usually yes. Staging often uses stricter blocking, while production should match your intended crawl and visibility setup.
Is robots.txt enough to control indexing on its own?
No. robots.txt is only one part of technical indexing control and should be reviewed together with the rest of your launch signals.