
4 Useful Hints About the robots.txt File for CS-Cart Stores

The robots.txt file tells crawling bots how to scan your website: what to crawl and what to ignore. Crawlers influence how search engines rank you, so filling in this file properly is basic SEO good practice for ranking higher in search. How do you tune this tiny file to make it work for you? Here is a cheat sheet from our tech guys.

#1 – Sitemap

Indicate the sitemap in your robots.txt file to make the site structure visible to bots so they can crawl and index your site better. The Sitemap directive tells search engines where to find your XML sitemap.
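
A minimal sketch of the directive, assuming your store lives at the placeholder domain example.com and the sitemap sits at the root (adjust both to your setup):

```
# Placeholder domain - replace with your store's actual URL
Sitemap: https://www.example.com/sitemap.xml
```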

#2 – Crawl Delay

Indicate crawl-delay intervals to define how frequently bots may scan your website. If the interval is too short, crawling can overload your website and make it hard for real customers to interact with your content.
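
A sketch using the non-standard Crawl-delay directive with an illustrative 10-second interval; note that support varies by crawler (Bing and Yandex honor it, while Googlebot ignores it):

```
User-agent: *
# Ask bots to wait 10 seconds between requests (value is illustrative)
Crawl-delay: 10
```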

#3 – Service Pages

Disallow crawling of non-relevant service pages, such as printout versions of pages. Content and product pages, in contrast, should stay open for indexing.
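
A sketch of such rules, assuming the standard CS-Cart ?dispatch= URL scheme; the dispatch values below are examples, so verify them against the URLs your store actually generates:

```
User-agent: *
# Block service pages such as checkout steps and login screens
# (dispatch values are illustrative - check your store's URLs)
Disallow: /*dispatch=checkout*
Disallow: /*dispatch=auth*
# Content and product pages remain crawlable
Allow: /
```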

#4 – dispatch=debugger

Pay attention to the *dispatch=debugger* page: the default CS-Cart robots.txt file does not block it properly, which can expose sensitive data (including developers' credentials). Preventing bots from checking out your private pages makes them much harder to find and index.
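
A sketch of the missing rule, using the wildcard syntax that major search engines support:

```
User-agent: *
# Hide the debugger page that the default CS-Cart robots.txt leaves exposed
Disallow: /*dispatch=debugger*
```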

Extra hints:

  • Ban bad bots! Learn more about bad bots here; a sample blocking rule follows this list.
  • Check the bot-to-visitor ratio to see how attractive your online store is in the eyes of real people. If bots outnumber your human visitors, that should give you pause for thought.
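
As a sketch, here is how a robots.txt section can ban a crawler outright; the bot names below are real crawlers that some store owners choose to block, used here purely as examples:

```
# Deny specific crawlers all access
# (bot names are illustrative - build your own blocklist)
User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /
```

Keep in mind that robots.txt is only a polite request: truly bad bots ignore it, so server-level blocking may also be needed.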

Closing

Adopt the best practices and rank higher with CS-Cart! Below is an illustrative example of a custom robots.txt file put together for the CS-Cart platform. Note that this file still needs to be fine-tuned to fit your website.
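
A hedged sketch combining the hints above; the directories reflect a typical CS-Cart 4.x layout and the domain is a placeholder, so verify every path against your own installation:

```
User-agent: *
# Illustrative crawl interval (ignored by Googlebot)
Crawl-delay: 10
# Typical CS-Cart system directories (verify against your install)
Disallow: /app/
Disallow: /var/
# Service pages and the exposed debugger (dispatch values are examples)
Disallow: /*dispatch=debugger*
Disallow: /*dispatch=checkout*
Allow: /

# Placeholder domain - replace with your store's actual URL
Sitemap: https://www.example.com/sitemap.xml
```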

And if you don’t want to tackle tech issues yourself, don’t hesitate to contact our developers!
