Checking for Disallowed and Noindexed Pages
To ensure that your eCommerce site is optimized for search engines, it's important to check whether any pages are both disallowed in the robots.txt file and carry a noindex meta tag. Because a disallowed page is never crawled, search engines cannot see its noindex directive, and the combination can keep important pages from being crawled and indexed at all.
For Shopify
- Go to your Shopify admin panel.
- Navigate to Online Store > Preferences.
- Scroll down to the Search engine listing preview section.
- Click on Edit website SEO.
- Check whether the noindex meta tag (<meta name="robots" content="noindex">) is present in the HTML of the page.
- To check the robots.txt file, append /robots.txt to your store's URL (e.g., https://yourstore.myshopify.com/robots.txt).
- Look for any disallowed pages in the robots.txt file and compare them with the pages that have the noindex tag.
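The robots.txt comparison above can also be scripted. Below is a minimal sketch using Python's standard-library urllib.robotparser; the robots.txt rules and store URLs are hypothetical placeholders, so substitute the contents of your own /robots.txt and the pages you care about:

```python
import urllib.robotparser

def disallowed_pages(robots_txt: str, urls: list[str], agent: str = "*") -> list[str]:
    """Return the URLs that the given robots.txt blocks for the given user agent."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch(agent, u)]

# Hypothetical robots.txt content, loosely modeled on Shopify's default rules:
robots_txt = """User-agent: *
Disallow: /checkout
Disallow: /cart
"""

# Hypothetical pages to audit:
pages = [
    "https://yourstore.myshopify.com/collections/sale",
    "https://yourstore.myshopify.com/checkout",
]

print(disallowed_pages(robots_txt, pages))
# -> ['https://yourstore.myshopify.com/checkout']
```

Any URL this returns that also carries a noindex tag is worth reviewing: if the page matters, remove both restrictions; if it should stay out of the index, keep the noindex but let it be crawled so the directive is actually seen.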
For WordPress (WooCommerce)
- Log in to your WordPress admin dashboard.
- Navigate to Settings > Reading.
- Ensure that the Search Engine Visibility option is unchecked.
- To check for noindex tags, use an SEO plugin such as Yoast SEO or Rank Math: open the page or post editor and look for the plugin's SEO settings box.
- Check whether the noindex meta tag (<meta name="robots" content="noindex">) is present.
- To view the robots.txt file, append /robots.txt to your site's URL (e.g., https://yoursite.com/robots.txt).
- Look for any disallowed pages in the robots.txt file and compare them with the pages that have the noindex tag.
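The noindex half of the check can likewise be scripted. Here is a small sketch using Python's built-in html.parser; the sample page source is a hypothetical stand-in for what you would copy from your browser's "view source":

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Scan <meta name="robots"> tags for a noindex directive."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        content = (a.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html: str) -> bool:
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

# Hypothetical page source for illustration:
sample = '<html><head><meta name="robots" content="noindex, nofollow"></head><body></body></html>'
print(has_noindex(sample))  # -> True
```

Running this over the pages flagged as disallowed in robots.txt gives you the list of pages that are both blocked and noindexed, which is exactly the overlap the steps above ask you to find by hand.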
By ensuring that no important pages are both disallowed and noindexed, you can improve your site’s visibility and performance in search engine results.