Checking if robots.txt Returns a 200 Status Code

Understanding Robots.txt and HTTP Status Codes

The robots.txt file is crucial for controlling how search engine crawlers access your website. A properly configured robots.txt file should return a 200 HTTP status code, indicating that the file is accessible and can be read by search engines. (Note that robots.txt controls crawling, not indexing; pages blocked from crawling can still appear in search results if they are linked from elsewhere.)

Shopify

On Shopify, the robots.txt file is automatically generated and cannot be directly edited. However, you can create custom rules using the robots.txt.liquid template. To ensure it returns a 200 HTTP status code, follow these steps:

  1. Go to your Shopify admin panel.
  2. Navigate to Online Store > Themes.
  3. Click on Actions > Edit Code.
  4. In the Templates section, click Add a new template and select robots.txt.liquid.
  5. Add your custom rules and save the file.

Shopify will automatically serve this file with a 200 HTTP status code.
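As a starting point, the sketch below extends Shopify's documented default robots.txt.liquid structure with one custom rule. The /internal-search/ path is a placeholder; adapt it to the paths you actually want to block.

```liquid
{%- comment -%}
  Render Shopify's default rules, then append a custom
  Disallow for the catch-all user agent.
{%- endcomment -%}
{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.user_agent.value == '*' %}
Disallow: /internal-search/
  {%- endif %}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Keeping the default loop intact ensures Shopify's standard rules and sitemap reference are still emitted alongside your additions.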

WooCommerce (WordPress)

For WooCommerce, you can manually create and edit the robots.txt file to ensure it returns a 200 HTTP status code. Here’s how:

  1. Access your WordPress dashboard.
  2. Navigate to Settings > Reading.
  3. Ensure the option Discourage search engines from indexing this site is unchecked.
  4. Using an FTP client or your hosting file manager, navigate to the root directory of your WordPress installation.
  5. Create or edit the robots.txt file and add your custom rules. (If no physical file exists, WordPress serves a virtual robots.txt; a physical file in the root directory overrides it.)
  6. Save the file and ensure it is accessible by visiting yourdomain.com/robots.txt.
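A minimal robots.txt for a typical WordPress/WooCommerce site might look like the following. The sitemap URL is a placeholder; wp-sitemap.xml is WordPress core's default sitemap location, but SEO plugins often use a different path.

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourdomain.com/wp-sitemap.xml
```

The Allow line matters because many themes and plugins rely on admin-ajax.php for front-end functionality, and blocking all of /wp-admin/ without it can interfere with how crawlers render your pages.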

To verify the HTTP status code, you can use online tools like HTTP Status or browser developer tools.
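If you prefer to verify the status code from a script rather than an online tool, a short Python sketch like the one below works for any of the platforms above. The domain and function names are placeholders for illustration.

```python
# Check whether a site's robots.txt returns an HTTP 200 status code.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def robots_url(domain: str) -> str:
    """Build the canonical robots.txt URL for a bare domain."""
    return f"https://{domain}/robots.txt"

def robots_status(domain: str) -> int:
    """Fetch /robots.txt and return the HTTP status code served."""
    req = Request(robots_url(domain), headers={"User-Agent": "robots-status-check"})
    try:
        with urlopen(req, timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        # urllib raises on 4xx/5xx; the code is still on the exception.
        return err.code

if __name__ == "__main__":
    code = robots_status("yourdomain.com")  # replace with your domain
    print("robots.txt OK" if code == 200 else f"Unexpected status: {code}")
```

A 404 here means the file is missing, while a 3xx chain or 5xx response can cause search engines to skip or distrust your rules, so anything other than 200 is worth investigating.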