How to disallow multiple folders in robots.txt

I want to disallow robots from crawling certain folders and all of their subfolders.

I want to disallow the following:

http://example.com/staging/
http://example.com/test/

This is the content of my robots.txt:

User-agent: *
Disallow: /staging/
Disallow: /test/

Is this right, and will it work?

Comments


  • Charles

    Yes, it is right! You add a separate Disallow line for each path. Each rule is a path prefix, so Disallow: /staging/ blocks that folder and everything beneath it (e.g. /staging/sub/page.html).

    Like this:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /img/
    Disallow: /docs/
    

    A good trick is to use a robots.txt generator. Another tip is to test your robots.txt with the robots.txt Tester in Google Search Console.
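
    You can also check the rules programmatically. Python's standard-library urllib.robotparser evaluates a robots.txt against sample URLs; here is a minimal sketch using the rules and example.com paths from the question:

    from urllib import robotparser

    # The rules from the question, supplied as a list of lines.
    rules = [
        "User-agent: *",
        "Disallow: /staging/",
        "Disallow: /test/",
    ]

    rp = robotparser.RobotFileParser()
    rp.parse(rules)

    # Disallowed paths (and anything beneath them) return False.
    print(rp.can_fetch("*", "http://example.com/staging/"))        # False
    print(rp.can_fetch("*", "http://example.com/test/page.html"))  # False
    print(rp.can_fetch("*", "http://example.com/"))                # True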
