I want to disallow robots from crawling certain folders and all of their subfolders. Specifically, I want to disallow the following:
http://example.com/staging/
http://example.com/test/
And this is the content of my robots.txt:
User-agent: *
Disallow: /staging/
Disallow: /test/
Is this right, and will it work?
Comments
Yes, it is right, and it will work. You have to add a separate Disallow line for each path you want to block.
Like this:
User-agent: *
Disallow: /staging/
Disallow: /test/
A good trick is to use a robots.txt generator. Another tip is to test your robots.txt with the robots.txt testing tool in Google Search Console.
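If you want to double-check the rules locally first, here is a minimal sketch using Python's standard urllib.robotparser module. The rules and the example.com URLs are the ones from the question; the /public/page.html path is just a hypothetical allowed URL for contrast:

from urllib.robotparser import RobotFileParser

# The rules from the question, exactly as they appear in robots.txt.
rules = """\
User-agent: *
Disallow: /staging/
Disallow: /test/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Anything under /staging/ or /test/ should be blocked;
# every other path should remain allowed.
for url in (
    "http://example.com/staging/",
    "http://example.com/test/sub/page.html",
    "http://example.com/public/page.html",  # hypothetical allowed path
):
    print(url, "->", "allowed" if parser.can_fetch("*", url) else "blocked")

Running this should print "blocked" for the first two URLs and "allowed" for the last, which also confirms that subfolders of a disallowed folder are blocked: Disallow rules are prefix matches, so /test/ covers /test/sub/page.html as well.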