Realtors Should Check Their Robots.txt File and Hear What Google Just Said

John Mueller

Real estate websites and blogs, large and small, carry listings that are already sold or outdated. Many of them use robots.txt files to tell Google, Bing, and Yahoo what to crawl and when to crawl it. Two days ago Google's John Mueller offered a clarification about robots.txt files that is worth knowing so you don't accidentally hurt your listings' rankings in search engines.

Some websites generate their robots.txt file dynamically; others serve a static file. Realtors, if your website is managed by a webmaster or a company, ask them how your site's robots.txt file is generated. If it is generated dynamically, which can mean it changes frequently, here is what Google says about the risk of doing that.
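For reference, a static robots.txt is just a small text file sitting at the root of your site (for example, www.yoursite.com/robots.txt). Below is a minimal sketch of what one might look like; the paths and sitemap address are made-up placeholders, not recommendations for any particular listing site:

    User-agent: *
    Disallow: /admin/
    Disallow: /internal-search/
    Sitemap: https://www.yoursite.com/sitemap.xml

Because the file is static, Google sees the same rules every time it checks, which avoids the caching problem John describes below.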

In a Stack Exchange discussion, Google's John Mueller said that the robots.txt file should be generated statically and updated by hand.

Here is what John wrote:

"Making the robots.txt file dynamic (for the same host! Doing this for separate hosts is essentially just a normal robots.txt file for each of them.) would likely cause problems: it's not crawled every time a URL is crawled from the site, so it can happen that the "wrong" version is cached. For example, if you make your robots.txt file block crawling during business hours, it's possible that it's cached then, and followed for a day -- meaning nothing gets crawled (or alternately, cached when crawling is allowed). Google crawls the robots.txt file about once a day for most sites, for example."

If this sounds too technical, share it with your webmaster, who will know whether anything needs to change. If it isn't set up correctly, your property listings may have low visibility in Google's search results, and home buyers may never see the houses you have listed for sale.
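If you simply want to see what your site's robots.txt currently says, you can open www.yoursite.com/robots.txt in any browser. For those comfortable with a few lines of code, here is a minimal sketch in Python that fetches and prints the file; the domain is a made-up placeholder you would swap for your own site's address:

    # Fetch and print a site's current robots.txt file.
    # "www.example-realty.com" is a hypothetical domain; use your own.
    import urllib.request

    url = "https://www.example-realty.com/robots.txt"
    with urllib.request.urlopen(url) as response:
        # robots.txt is plain text, so it can be printed directly
        print(response.read().decode("utf-8"))

If what you see there changes from one day to the next, that is a sign the file is being generated dynamically and is worth a conversation with your webmaster.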
