Add Staging robots.txt in GHL

Purpose

Documents the process for adding a robots.txt file to staging subdomains in GoHighLevel to prevent search engines from indexing non-production environments.

Scope

Covers robots.txt setup in GHL for staging environments only. Does not include production domains or other GHL configurations.

Prerequisites

  • Access to GHL admin dashboard for the target staging domain
  • Familiarity with robots.txt syntax
  • Approval from Web Dev or SEO lead

Step-by-Step Procedure

Access Domain Configuration

  1. Log into GHL admin dashboard
  2. Navigate to Settings > Domains
  3. Locate your staging subdomain (example: staging.clientwebsite.com)

Configure robots.txt

  1. Click the Edit button for the relevant subdomain
  2. The Edit Domain dialog opens
  3. In the "Robots.txt code" box, enter the following:
User-agent: *
Disallow: /
  4. Click the Save button
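The two-line rule above can be sanity-checked locally before saving, using Python's standard `urllib.robotparser`. This is a minimal sketch; the staging hostname is the placeholder from this SOP, not a real endpoint:

```python
from urllib.robotparser import RobotFileParser

# The exact rules entered in the "Robots.txt code" box
rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every crawler should be blocked from every path on staging
for agent in ("Googlebot", "Bingbot", "AnyOtherBot"):
    allowed = parser.can_fetch(agent, "https://staging.clientwebsite.com/")
    print(agent, allowed)  # → False for each agent
```

Because `User-agent: *` matches all crawlers and `Disallow: /` covers every path, `can_fetch` returns `False` regardless of the user agent or URL checked.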

Validate Configuration

  1. Visit https://staging.clientwebsite.com/robots.txt to confirm the file is active
  2. Verify the content matches the configured rules
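The validation steps above can be scripted if you check many staging subdomains. The sketch below uses only the Python standard library; `robots_blocks_all` and `contains_block_all` are hypothetical helper names, and the commented call uses this SOP's placeholder domain:

```python
from urllib.request import urlopen

def contains_block_all(body: str) -> bool:
    """Return True if the robots.txt body disallows all crawlers."""
    lines = [ln.strip() for ln in body.replace("\r\n", "\n").splitlines()]
    return "User-agent: *" in lines and "Disallow: /" in lines

def robots_blocks_all(url: str) -> bool:
    """Fetch robots.txt from the given URL and validate its rules."""
    with urlopen(url) as resp:
        return contains_block_all(resp.read().decode("utf-8"))

# Example call against a live staging domain (placeholder from this SOP):
# robots_blocks_all("https://staging.clientwebsite.com/robots.txt")
print(contains_block_all("User-agent: *\nDisallow: /\n"))  # → True
```

Splitting the check into a pure string function keeps the content comparison testable without network access.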

Required Inputs

  • Admin permissions in GHL
  • Staging environment subdomain URL

Expected Outputs

  • Staging domain returns a robots.txt that blocks all crawlers
  • No staging content is indexed by public search engines

Troubleshooting

robots.txt Not Accessible

  • Wait up to 10 minutes for propagation
  • Double-check domain configuration in GHL settings

Content Still Indexed

  • Note that robots.txt blocks crawling, not indexing; URLs discovered before the rules were added may still appear in search results until removed
  • Submit a URL removal request in Google Search Console
  • Request removal of any cached versions of staging pages

Exceptions and Edge Cases

  • If staging domain must allow certain bots, consult SEO lead before specifying exceptions
  • Never use production robots.txt rules on staging environments
Related Documents

  • SEO Setup Checklist
  • GHL Admin Configuration Guide
  • Domain Management Procedures

Revision History

Date        Version  Change Description    Author
2025-08-07  1.0      Initial staging SOP   Symphony Core Systems Team