Add Staging robots.txt in GHL
Purpose
Documents the process for adding a robots.txt file to staging subdomains in GoHighLevel to prevent search engines from indexing non-production environments.
Scope
Covers robots.txt setup in GHL for staging environments only. Does not include production domains or other GHL configurations.
Prerequisites
- Access to GHL admin dashboard for the target staging domain
- Familiarity with robots.txt syntax
- Approval from Web Dev or SEO lead
Step-by-Step Procedure
Access Domain Configuration
- Log into GHL admin dashboard
- Navigate to Settings > Domains
- Locate your staging subdomain (example: staging.clientwebsite.com)
Configure robots.txt
- Click the Edit button for the relevant subdomain
- The Edit Domain dialog opens
- In the "Robots.txt code" box, enter the following:

```
User-agent: *
Disallow: /
```
- Click the Save button
Validate Configuration
- Visit https://staging.clientwebsite.com/robots.txt (substituting your actual staging subdomain) to confirm the file is live
- Verify the returned content matches the rules configured in GHL
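The configured rules can also be sanity-checked with Python's standard `urllib.robotparser` before (or in addition to) visiting the live endpoint. This is a minimal sketch; the staging URL below is the placeholder example domain from this SOP, not a real site:

```python
from urllib.robotparser import RobotFileParser

# The exact rules entered in the GHL "Robots.txt code" box.
RULES = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# With "Disallow: /", no crawler should be allowed to fetch any path.
for agent in ("Googlebot", "Bingbot", "*"):
    for path in ("/", "/index.html", "/private/page"):
        allowed = parser.can_fetch(agent, "https://staging.clientwebsite.com" + path)
        assert not allowed, f"{agent} unexpectedly allowed at {path}"

print("All tested crawlers are blocked for all tested paths")
```

If this check passes but the live `/robots.txt` serves different content, the problem is in the GHL domain configuration or propagation, not in the rules themselves.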
Required Inputs
- Admin permissions in GHL
- Staging environment subdomain URL
Expected Outputs
- Staging domain returns a robots.txt that blocks all crawlers
- No staging content is indexed by public search engines
Troubleshooting
robots.txt Not Accessible
- Wait up to 10 minutes for propagation
- Double-check domain configuration in GHL settings
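While waiting for propagation, a small polling script can confirm when the file becomes reachable. This is a sketch, not part of the GHL product: the URL is the placeholder from this SOP, and the fetch function is injectable so the logic can be exercised without network access:

```python
import time
from urllib.request import urlopen

# The rules we expect the staging domain to serve.
EXPECTED = "User-agent: *\nDisallow: /"

def robots_matches(url, fetch=None, attempts=5, delay=60):
    """Poll `url` until its body contains the expected rules.

    `fetch` takes a URL and returns the response body as text;
    it defaults to a simple urllib-based fetcher. Returns True as
    soon as a match is seen, False after all attempts fail.
    """
    if fetch is None:
        fetch = lambda u: urlopen(u, timeout=10).read().decode("utf-8")
    for _ in range(attempts):
        try:
            if EXPECTED in fetch(url):
                return True
        except OSError:
            pass  # endpoint not reachable yet; keep polling
        time.sleep(delay)
    return False

# Example (real network call, so only run against an actual staging domain):
# robots_matches("https://staging.clientwebsite.com/robots.txt")
```

If the script still reports no match after roughly 10 minutes, recheck the domain entry in GHL settings rather than continuing to poll.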
Content Still Indexed
- Note that robots.txt blocks crawling but does not remove pages already indexed; submit a removal request in Google Search Console for any indexed staging URLs
- Clear any CDN or browser caches that may be serving an older robots.txt
Exceptions and Edge Cases
- If staging domain must allow certain bots, consult SEO lead before specifying exceptions
- Never use production robots.txt rules on staging environments
Related Documents
- SEO Setup Checklist
- GHL Admin Configuration Guide
- Domain Management Procedures
Revision History
| Date | Version | Change Description | Author |
|---|---|---|---|
| 2025-08-07 | 1.0 | Initial staging SOP | Symphony Core Systems Team |