
Ensuring robots.txt is Not Blocking Ranking Pages


In eCommerce SEO, it’s crucial that your robots.txt file does not block crawling of the pages you need to rank, such as product pages, category pages, and blog posts. Here’s how to check and update your robots.txt file on Shopify and WooCommerce.
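Before editing anything, it helps to see what a given robots.txt actually blocks. A minimal sketch using Python’s standard-library `urllib.robotparser` (the domain and rules are illustrative, not from any real store) shows how a single overly broad `Disallow` hides every ranking page from crawlers:

```python
from urllib.robotparser import RobotFileParser

# An overly aggressive robots.txt (illustrative only): "Disallow: /"
# blocks every URL on the site, including pages that need to rank.
bad_rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(bad_rules.splitlines())

# example.com is a placeholder; substitute your own store's domain.
print(parser.can_fetch("*", "https://example.com/products/blue-shirt"))  # False: blocked
```

To audit a live site instead of a pasted string, call `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`.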

Shopify

Shopify automatically generates a robots.txt file for your store. To customize it, you need to create a robots.txt.liquid file in your theme. Follow these steps:

  1. Go to your Shopify admin and navigate to Online Store > Themes.
  2. Click Actions > Edit Code.
  3. In the Templates section, click Add a new template and select robots.txt from the dropdown menu.
  4. In the new robots.txt.liquid file, ensure you are not blocking important pages. For example:
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Allow: /collections
Allow: /products
Allow: /blogs

Save the changes and your custom robots.txt will be live. Note that a static file like this replaces Shopify’s generated defaults entirely, so include every rule you need; Shopify generally recommends adjusting the default Liquid rules rather than replacing the whole file.
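To confirm that rules like the ones above keep ranking pages crawlable, you can test them with Python’s built-in `urllib.robotparser` before deploying. This is a sketch; `example.com` stands in for your store’s domain:

```python
from urllib.robotparser import RobotFileParser

# The example Shopify rules from the steps above.
rules = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Allow: /collections
Allow: /products
Allow: /blogs
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

base = "https://example.com"  # placeholder domain
print(parser.can_fetch("*", base + "/products/blue-shirt"))  # True: crawlable
print(parser.can_fetch("*", base + "/collections/summer"))   # True: crawlable
print(parser.can_fetch("*", base + "/cart"))                 # False: blocked
```

Python’s parser applies rules in order (first match wins), which agrees with Google’s behavior for simple prefix rules like these, though Google itself uses most-specific-match.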

WooCommerce

For WooCommerce, you can edit the robots.txt file directly if you have access to your site’s root directory, or you can use an SEO plugin like Yoast SEO. Here’s how to do it with Yoast SEO:

  1. Install and activate the Yoast SEO plugin.
  2. Go to SEO > Tools in your WordPress dashboard.
  3. Click on File editor.
  4. If you don’t already have a robots.txt file, Yoast will offer to create one; if you do, you can edit it directly.
  5. Ensure your robots.txt file is not blocking important pages. For example:
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /checkout/
Allow: /product/
Allow: /product-category/
Allow: /blog/

Save the changes and your updated robots.txt file will be live.
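The same kind of pre-deployment check works for WooCommerce. Below is a small, hypothetical helper (standard library only; `example.com` is a placeholder) that audits a rule set against a list of pages that must stay crawlable and a list of paths that should stay blocked:

```python
from urllib.robotparser import RobotFileParser

def audit(rules: str, must_allow: list[str], must_block: list[str]) -> list[str]:
    """Return a list of problems found in the given robots.txt rules."""
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    problems = []
    for path in must_allow:
        if not parser.can_fetch("*", "https://example.com" + path):
            problems.append(f"ranking page blocked: {path}")
    for path in must_block:
        if parser.can_fetch("*", "https://example.com" + path):
            problems.append(f"private page crawlable: {path}")
    return problems

# The example WooCommerce rules from step 5 above.
woo_rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /checkout/
Allow: /product/
Allow: /product-category/
Allow: /blog/
"""

print(audit(
    woo_rules,
    must_allow=["/product/blue-shirt/", "/product-category/shirts/", "/blog/"],
    must_block=["/cart/", "/checkout/"],
))  # prints [] when the rules pass
```

An empty result means every ranking page is crawlable and every private path is blocked; any entry in the list names the exact path that needs a rule change.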

By ensuring your robots.txt file is correctly configured, you can help search engines crawl and index the pages that are crucial for your eCommerce SEO strategy.