Managing your Robots.txt file
The Default robots.txt file
By default, Shopit has a robots.txt file that reads:
User-agent: *
Allow: /
Disallow: /search
Disallow: /login
Disallow: /account
Disallow: /checkout
Sitemap: https://{{ app.request.host }}/sitemap.xml
You can check this by going to your store URL and adding /robots.txt.
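If you prefer to check this programmatically, a minimal Python sketch like the one below fetches and prints the live file. The domain your-store.example is a placeholder for your actual store domain.

import urllib.request

# Placeholder domain; substitute your own store URL
url = "https://your-store.example/robots.txt"

with urllib.request.urlopen(url) as response:
    # robots.txt is served as plain text
    print(response.read().decode("utf-8"))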
Creating your own robots.txt file
If you want to create your own robots.txt file, you can do so via the Snippets feature.
- Create a snippet called robotstxt, referenced as snippet('robotstxt').
- Add your required code; this may require the help of an experienced developer.
- You should add the default code above to it first: your version will NOT append to the default file, it replaces it with a blank new one. For example, to also block the basket page, add the line Disallow: /checkout/basket, as shown in the complete snippet below.
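Putting it together, a snippet that keeps the default rules and also blocks the basket page would read as follows (the Sitemap line uses the same template variable as the default file):

User-agent: *
Allow: /
Disallow: /search
Disallow: /login
Disallow: /account
Disallow: /checkout
Disallow: /checkout/basket
Sitemap: https://{{ app.request.host }}/sitemap.xml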
You can check your new file by going to your store URL and adding /robots.txt. Allow approximately 10 minutes for the front end cache to clear, or use the Design > Edit Website area (which is not cached) and change the URL there to check immediately.
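Once the cache has cleared, you can confirm that crawlers will honour the new rule using Python's built-in robots.txt parser; again, your-store.example stands in for your own domain.

import urllib.robotparser

# Placeholder domain; substitute your own store URL
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://your-store.example/robots.txt")
parser.read()  # fetches and parses the live file

# can_fetch(user_agent, url) returns False for disallowed paths
print(parser.can_fetch("*", "https://your-store.example/checkout/basket"))  # expect False
print(parser.can_fetch("*", "https://your-store.example/products"))  # expect True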