Ensuring Robots.txt Does Not Block Important Pages
In eCommerce SEO, it’s crucial to ensure that your robots.txt file is not blocking the crawling of pages that need to rank, such as product pages, category pages, and blog posts. Here’s how you can check and update your robots.txt file for Shopify and WooCommerce.
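Before touching platform-specific settings, it helps to recognize what an over-blocking file looks like. The single most destructive pattern is a blanket disallow, which tells every crawler to skip the entire site:

```
# Never ship this by accident: it blocks every crawler from every page
User-agent: *
Disallow: /
```

Anything that broad — or a Disallow rule that happens to match your product or category URL paths — will keep revenue-driving pages out of the index entirely.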
Shopify
Shopify automatically generates a robots.txt file for your store. To customize it, you need to create a robots.txt.liquid file in your theme. Follow these steps:
- Go to your Shopify admin and navigate to Online Store > Themes.
- Click Actions > Edit Code.
- In the Templates section, click Add a new template and select robots.txt from the dropdown menu.
- In the new robots.txt.liquid file, ensure you are not blocking important pages. For example:
```
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Allow: /collections
Allow: /products
Allow: /blogs
```
Save the changes, and your custom robots.txt will be live.
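One caveat: Shopify assembles its default robots.txt with Liquid, so replacing robots.txt.liquid with plain directives discards Shopify’s built-in rules and its Sitemap line. A safer pattern, sketched below from Shopify’s documented robots.default Liquid object, loops over the defaults and only appends what you need (the /search disallow here is purely illustrative):

```liquid
{%- comment -%}
  robots.txt.liquid - re-emits Shopify's default groups (including the
  Sitemap line) and appends an extra rule only to the wildcard group.
{%- endcomment -%}
{% for group in robots.default.groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules %}
    {{ rule }}
  {%- endfor %}

  {%- comment -%} Illustrative extra rule: hide internal search results {%- endcomment -%}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /search' }}
  {%- endif %}

  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif %}
{% endfor %}
```

Because the loop re-emits group.sitemap, your sitemap reference survives the customization instead of being dropped.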
WooCommerce
For WooCommerce, you can edit the robots.txt file directly if you have access to your site’s root directory, or you can use an SEO plugin like Yoast SEO. Here’s how to do it with Yoast SEO:
- Install and activate the Yoast SEO plugin.
- Go to SEO > Tools in your WordPress dashboard.
- Click on File editor.
- If you don’t have a robots.txt file, Yoast will allow you to create one. If you do, you can edit it directly.
- Ensure your robots.txt file is not blocking important pages. For example:
```
User-agent: *
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /checkout/
Allow: /product/
Allow: /product-category/
Allow: /blog/
```
Save the changes, and your updated robots.txt file will be live.
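Two refinements worth considering when you save that file: WordPress themes and plugins load assets through admin-ajax.php, so blocking all of /wp-admin/ without an exception can interfere with how Google renders your pages, and Yoast generates a sitemap index you can reference directly. A fuller sketch (example.com is a placeholder for your own domain):

```
User-agent: *
# Keep the WordPress backend out of crawlers' reach...
Disallow: /wp-admin/
# ...but allow the AJAX endpoint that themes and plugins rely on
Allow: /wp-admin/admin-ajax.php
# Thin, duplicate-prone pages with no ranking value
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/

# Point crawlers at Yoast's sitemap index
Sitemap: https://example.com/sitemap_index.xml
```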
By ensuring your robots.txt file is correctly configured, you can help search engines crawl and index the pages that are crucial for your eCommerce SEO strategy.