
Understanding and Configuring the Robots.txt File for Shopify and WooCommerce

The robots.txt file is crucial for controlling how search engines crawl your website. (Note that it governs crawling, not indexing: a disallowed URL can still appear in search results if other sites link to it.) This tutorial will guide you through setting up and optimizing the robots.txt file for both Shopify and WooCommerce platforms.

Shopify

Shopify automatically generates a robots.txt file for your store. However, you can customize it by creating a robots.txt.liquid file in your theme’s Templates directory.

  1. Go to your Shopify admin panel.
  2. Navigate to Online Store > Themes.
  3. Click Actions > Edit Code.
  4. In the Templates directory, click Add a new template and select robots.txt.liquid.
  5. Add your custom rules in the robots.txt.liquid file. For example:
    {% raw %}
    User-agent: *
    Disallow: /checkout/
    Disallow: /cart/
    Disallow: /orders/
    {% endraw %}
  6. Save the file and your custom robots.txt rules will be applied.
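Once the template is saved, you can sanity-check the rules it produces. The sketch below is a minimal, simplified parser (it does not handle grouped user-agents or wildcards) that extracts the Disallow paths for a given agent from robots.txt text like the example above; the `rules` string is just the sample from step 5.

```python
def disallowed_paths(robots_txt, agent="*"):
    """Collect Disallow paths that apply to the given user agent.

    Simplified: one user-agent per group, no wildcard matching.
    """
    paths, applies = [], False
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            applies = (value == agent)
        elif field == "disallow" and applies and value:
            paths.append(value)
    return paths

rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /cart/
Disallow: /orders/
"""
print(disallowed_paths(rules))  # ['/checkout/', '/cart/', '/orders/']
```

Running this against the text your store actually serves at /robots.txt is a quick way to confirm the template rendered the rules you expect.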

WooCommerce

For WooCommerce, you need to manually create and upload the robots.txt file to your website’s root directory. (WordPress serves a virtual robots.txt by default; a physical file in the root directory takes precedence over it.)

  1. Create a robots.txt file on your computer using a text editor.
  2. Add your custom rules. For example:
    User-agent: *
    Disallow: /checkout/
    Disallow: /cart/
    Disallow: /my-account/
    Disallow: /wp-admin/
  3. Save the file and upload it to the root directory of your WordPress installation using an FTP client or your hosting provider’s file manager.
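If you prefer to script steps 1–2, a short sketch like the following writes the file for you; here it writes into the system temp directory purely for illustration, and you would still upload the result to your WordPress root via FTP/SFTP as in step 3.

```python
import tempfile
from pathlib import Path

# The WooCommerce rules from step 2, written out programmatically.
RULES = "\n".join([
    "User-agent: *",
    "Disallow: /checkout/",
    "Disallow: /cart/",
    "Disallow: /my-account/",
    "Disallow: /wp-admin/",
]) + "\n"

def write_robots(directory):
    """Write robots.txt into the given directory and return its path."""
    target = Path(directory) / "robots.txt"
    target.write_text(RULES, encoding="utf-8")
    return target

# Illustrative target; in practice this would be your local staging folder.
out = write_robots(tempfile.gettempdir())
```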

Testing Your Robots.txt File

After setting up your robots.txt file, it’s important to test it to ensure it’s working correctly.

  1. For Shopify, simply navigate to https://yourstore.myshopify.com/robots.txt to view the file.
  2. For WooCommerce, navigate to https://yourdomain.com/robots.txt to view the file.
  3. Use the robots.txt report in Google Search Console to validate your file (it replaced Google’s standalone robots.txt Tester).
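You can also test the rules programmatically with Python’s standard-library `urllib.robotparser`, which answers the same question a crawler would ask: is this URL allowed? The `rules` string below is the WooCommerce example from earlier, and `yourdomain.com` is a placeholder.

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /cart/
Disallow: /my-account/
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Crawlers obeying these rules may fetch product pages but not the cart.
print(rp.can_fetch("*", "https://yourdomain.com/product/sample/"))  # True
print(rp.can_fetch("*", "https://yourdomain.com/cart/"))            # False
```

To check the live file instead of a local string, point the parser at your site with `rp.set_url("https://yourdomain.com/robots.txt")` followed by `rp.read()`.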

By properly configuring your robots.txt file, you can control which parts of your eCommerce site are accessible to search engines, thereby optimizing your site’s SEO performance.