Understanding Your Robots.txt File

While the robots.txt file is not part of the Showcase IDX plugin itself, it's important to understand the impact it can have on your site, along with your sitemap XML file, which is detailed further here. The purpose of a sitemap is to tell search engines which pages they should crawl on your website.

A robots.txt file tells search engine crawlers which URLs they can access on your site. In short, it instructs search engines like Google to focus on the parts of your site that benefit most from indexing (listings, community pages) while ignoring parts that wouldn't (admin pages, diagnostic pages, etc.). We recommend that you not block Showcase IDX content in your robots.txt file.
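For example, a robots.txt file along these lines keeps crawlers out of non-public areas while leaving your listing content crawlable. The specific paths and domain shown here are hypothetical; adjust them to match your own site's structure:

```
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of non-public areas (example paths only)
Disallow: /wp-admin/
Disallow: /diagnostics/
# Point crawlers at your sitemap (use your own domain)
Sitemap: https://example.com/sitemap.xml
```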

A robots.txt file is used mainly to avoid overloading your site with crawler requests; it is not a mechanism for keeping a web page out of Google's index. More details on the purpose of a robots.txt file are in Google's documentation here.

If you have a WordPress site, by default it serves its own robots.txt file that search engines will pick up. More details on this setting are here. You have a few options if you need to add customizations to your robots.txt file:

  • You can create a custom robots.txt file and upload it to your root directory via SFTP.
  • You can use a plugin specifically designed to edit your robots.txt file, like Robots.txt Editor.
  • Plugins designed to improve SEO (like Yoast SEO or Rank Math) also include sections for editing your robots.txt file.
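Whichever option you choose, it helps to know your starting point: the virtual robots.txt that WordPress serves by default typically looks like the following (the exact contents can vary by WordPress version and installed plugins):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Any custom file you upload or rules you add through a plugin will replace or extend these defaults.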

Note that WordPress.com only allows editing this file on certain plans, such as Business or eCommerce.