How to Correctly Set Up & Configure Robots.txt in OpenCart
Robots.txt is a text file that helps search engines such as Google and Bing understand which information on a site should be indexed. It is a small file, but a critical one for the SEO success of any store.
Why Do You Need a Robots.txt File?
The robots.txt file should be placed in the root directory of your website to tell search engines which pages to skip and which to index. Webmasters use robots.txt files to guide how search engines crawl their sites: with this file, they can tell search engine spiders not to crawl pages they do not consider important enough to be crawled, such as PDF files or printable versions of pages. This gives the important pages a better chance of being featured in search engine result pages.
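For instance, a minimal robots.txt along these lines could keep crawlers away from PDF files and printable page versions. The `?print` query parameter here is only an illustrative assumption, and note that the `*` and `$` wildcards are extensions honored by major crawlers such as Googlebot and Bingbot rather than part of the original robots.txt standard:

```
User-agent: *
Disallow: /*.pdf$
Disallow: /*?print
```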
Improving Performance Using Robots.txt in OpenCart
There are several areas where a robots.txt file can help; here are the two primary ones:
1. Robots.txt helps prevent duplicate-content issues, one of the keys to SEO success.
2. Robots.txt also helps you hide technical details of your site, e.g. error logs, SVN files, unwanted directories and so on. Since these are blocked by robots.txt, you are left with clean URLs to be indexed by search engines.
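As a sketch of point 2, a fragment like the one below blocks a few technical locations. The `/admin/` and `/system/` paths are standard OpenCart directories; the `/.svn/` entry is a hypothetical example of keeping version-control metadata out of the index:

```
User-agent: *
Disallow: /admin/
Disallow: /system/
Disallow: /.svn/
```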
Set Up Robots.txt in OpenCart
Before you set up the robots.txt file, you should know that its rules cover only one domain at a time, so for multiple stores you have to create a separate robots.txt file for each store. Creating robots.txt is super simple, since it is nothing but a text file and can be created with any text editor, such as Dreamweaver, Notepad, Vim, or your favorite code editor.
Once you have created the robots.txt file, it must reside at the root of your site. For example, if your store domain is www.mystore.com, the file should be reachable at www.mystore.com/robots.txt.
Robots.txt for OpenCart
The following is a well-tested robots.txt for OpenCart stores:
User-agent: *
Disallow: /*&limit
Disallow: /*?limit
Disallow: /*?sort
Disallow: /*&sort
Disallow: /*?order
Disallow: /*&order
Disallow: /*?price
Disallow: /*&price
Disallow: /*?brand_tabletpc
Disallow: /*&brand_tabletpc
Disallow: /*?color_default
Disallow: /*&color_default
Disallow: /*?filter_tag
Disallow: /*&filter_tag
Disallow: /*?mode
Disallow: /*&mode
Disallow: /*?cat
Disallow: /*&cat
Disallow: /*?dir
Disallow: /*&dir
Disallow: /*?color
Disallow: /*&color
Disallow: /*?product_id
Disallow: /*&product_id
Disallow: /*?minprice
Disallow: /*&minprice
Disallow: /*?maxprice
Disallow: /*&maxprice
Disallow: /*?route=checkout/
Disallow: /*?route=account/
Disallow: /*?route=product/search
Disallow: /*?page=1
Disallow: /*&create=1
Disallow: /?route=information/contact
Disallow: /*?route=affiliate/
Disallow: /*?keyword
Disallow: /*?av
Disallow: /admin/
Disallow: /system/
Disallow: /catalog/
Sitemap: http://www.mystore.com/index.php?route=feed/google_sitemap
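If you want to sanity-check such rules before deploying, the parser in Python's standard library offers a quick way. Note that it follows the original 1997 specification and does plain prefix matching, so it does not interpret the `/*?sort`-style wildcard rules the way Google does; the sketch below therefore checks only the simple directory rules, against a hypothetical store URL:

```python
from urllib.robotparser import RobotFileParser

# Only the simple prefix rules from the file above; the standard-library
# parser would treat wildcard patterns like "/*?sort" as literal prefixes.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /system/
Disallow: /catalog/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Technical directories are blocked for all crawlers
print(rp.can_fetch("*", "http://www.mystore.com/admin/"))        # False
print(rp.can_fetch("*", "http://www.mystore.com/system/logs/"))  # False

# Regular store pages stay crawlable
print(rp.can_fetch("*", "http://www.mystore.com/"))              # True
```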
I hope the above tutorial helps you define what needs to be indexed and what should be kept out of Google search results. Note that listing the sitemap path in robots.txt is also a good idea. Although this robots.txt file is fairly generic, you can still fine-tune it and drop the few rules that are not applicable to your store.
Let us know if you have any questions or face any difficulty setting up the robots.txt file in your store running on OpenCart.