
Disallowed and Noindexed Pages

Checking for Disallowed and Noindexed Pages

To keep your eCommerce site optimized for search engines, check whether any pages are both disallowed in the robots.txt file and carry a noindex meta tag. A disallow rule prevents search engines from crawling a page, and a noindex tag prevents it from being indexed; applied to important pages, these directives can keep them out of search results entirely, and blocking the crawl can even stop search engines from ever seeing the noindex tag.
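If you want to automate this check across several URLs, the sketch below uses only the Python standard library. The store domain and page paths are placeholders, and the meta-tag pattern is a rough heuristic rather than a full HTML parse; treat it as a starting point, not an exhaustive audit.

```python
import re
import urllib.request
import urllib.robotparser

BASE_URL = "https://yourstore.example.com"  # placeholder: substitute your store's domain
PAGES = ["/collections/sale", "/products/example-product"]  # placeholder paths to audit

# Parse robots.txt so we can ask whether a given URL is disallowed.
robots = urllib.robotparser.RobotFileParser()
robots.set_url(BASE_URL + "/robots.txt")
robots.read()

# Rough pattern for a robots meta tag whose content includes "noindex".
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

for path in PAGES:
    url = BASE_URL + path
    disallowed = not robots.can_fetch("*", url)

    # Fetch the page HTML and look for a noindex robots meta tag.
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    noindexed = bool(NOINDEX_RE.search(html))

    if disallowed and noindexed:
        print(f"WARNING: {path} is both disallowed in robots.txt and noindexed")
    else:
        print(f"OK: {path} (disallowed={disallowed}, noindexed={noindexed})")
```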

For Shopify

  1. Go to your Shopify admin panel.
  2. Navigate to Online Store > Preferences.
  3. Scroll down to the Search engine listing preview section.
  4. Click on Edit website SEO.
  5. Check whether the noindex meta tag (<meta name="robots" content="noindex">) is present in the HTML of the page.
  6. To check the robots.txt file, append /robots.txt to your store’s URL (e.g., https://yourstore.myshopify.com/robots.txt).
  7. Look for any disallowed pages in the robots.txt file and compare them with the pages that have the noindex tag (the sketch after this list shows one way to print the Disallow rules).
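If you would rather inspect the robots.txt rules from a script than in the browser, the following sketch prints every Disallow rule grouped by user agent. The domain is the placeholder from step 6; substitute your own store.

```python
import urllib.request

# Placeholder domain from the example in step 6.
ROBOTS_URL = "https://yourstore.myshopify.com/robots.txt"

with urllib.request.urlopen(ROBOTS_URL) as resp:
    lines = resp.read().decode("utf-8", errors="replace").splitlines()

current_agent = None
for line in lines:
    line = line.split("#", 1)[0].strip()  # drop comments and surrounding whitespace
    if not line:
        continue
    field, _, value = line.partition(":")
    field, value = field.strip().lower(), value.strip()
    if field == "user-agent":
        current_agent = value
    elif field == "disallow" and value:
        print(f"{current_agent or '*'}: Disallow {value}")
```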

For WordPress (WooCommerce)

  1. Log in to your WordPress admin dashboard.
  2. Navigate to Settings > Reading.
  3. Ensure that the Search engine visibility option ("Discourage search engines from indexing this site") is unchecked.
  4. To check for noindex tags, you can use an SEO plugin like Yoast SEO or Rank Math. Go to the page or post editor and look for the SEO settings box.
  5. Check whether the noindex meta tag (<meta name="robots" content="noindex">) is present (see the sketch after this list for an automated check).
  6. To view the robots.txt file, append /robots.txt to your site’s URL (e.g., https://yoursite.com/robots.txt).
  7. Look for any disallowed pages in the robots.txt file and compare them with the pages that have the noindex tag.
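To confirm step 5 without reading the page source by hand, the sketch below fetches a page and reports both the robots meta tag in the HTML and the X-Robots-Tag response header, either of which can carry a noindex directive. The URL is a placeholder; substitute the page you want to check.

```python
import re
import urllib.request

# Placeholder page URL; substitute the product, category, or post you want to check.
PAGE_URL = "https://yoursite.com/sample-page/"

with urllib.request.urlopen(PAGE_URL) as resp:
    header = resp.headers.get("X-Robots-Tag", "") or ""
    html = resp.read().decode("utf-8", errors="replace")

# Find the robots meta tag, if any, in the returned HTML.
meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE)

print("X-Robots-Tag header:", header or "(not set)")
print("Robots meta tag:", meta.group(0) if meta else "(not found)")

if "noindex" in header.lower() or (meta and "noindex" in meta.group(0).lower()):
    print("This page carries a noindex directive.")
```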

By ensuring that no important pages are both disallowed and noindexed, you can improve your site’s visibility and performance in search engine results.