Managing your Robots.txt file

The Default robots.txt file


By default, Shopit has a robots.txt file that reads:
User-agent: *
Allow: /
Disallow: /search
Disallow: /login
Disallow: /account
Disallow: /checkout
Sitemap: https://{{ app.request.host }}/sitemap.xml

You can check this by going to your store URL and adding /robots.txt to the end.

Creating your own robots.txt file


If you want to create your own robots.txt file, you can do so via the Snippets feature.
  1. Create a snippet called robotstxt, i.e. snippet('robotstxt').
  2. Add your required code - this may require an experienced developer.
  3. You should add the default rules above to it first (your version will NOT append to the default; it starts from a blank file). For example, to block the basket page as well, add the line Disallow: /checkout/basket, as shown in the example below this list.
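
To illustrate, the complete snippet content might then look like the following: the default rules from above, plus the extra Disallow line. The exact rules are up to you; this is only a sketch:

User-agent: *
Allow: /
Disallow: /search
Disallow: /login
Disallow: /account
Disallow: /checkout
Disallow: /checkout/basket
Sitemap: https://{{ app.request.host }}/sitemap.xml

Note that because robots.txt rules match by prefix, Disallow: /checkout already covers /checkout/basket; the extra line only matters if you remove or narrow the broader /checkout rule.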


You can check your new file by going to your store URL and adding /robots.txt. Allow approximately 10 minutes for the front-end cache to clear, or use the Design > Edit Website area (which is not cached) and change the URL there instead.
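
If you prefer to check from a script rather than the browser, you can fetch the live file and confirm your custom rule is present. This is a minimal sketch in Python; https://yourstore.example is a placeholder for your actual store domain, and Disallow: /checkout/basket is the example rule from the steps above:

from urllib.request import urlopen

# Placeholder: replace yourstore.example with your real store domain.
robots_url = "https://yourstore.example/robots.txt"

# Download the live robots.txt and decode it as text.
with urlopen(robots_url) as response:
    body = response.read().decode("utf-8")

print(body)  # inspect the file as crawlers currently see it

# Confirm the custom rule has made it through the front-end cache.
assert "Disallow: /checkout/basket" in body, "custom rule not live yet"

If the assertion fails, the front-end cache most likely has not cleared yet; wait the approximately 10 minutes mentioned above and run the check again.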





