sam samm

robots.txt Help! URGENT

Hi,

I have a requirement in which I need to give permission to only one page in robots.txt rather than everything. I also don't want to give permission to the static resources, custom settings, or anything else.

Example: I want to give permission to only one page, myhomepage.page.



Vikash (Salesforce Developers)
Hi Sam,

I believe no permission is required for uploading robots.txt on a site.
Please refer to these links:
https://developer.salesforce.com/forums/?id=906F000000099cdIAA
http://www.computerhope.com/jargon/r/robotstx.htm
http://www.salesforce.com/robots.txt

Thanks
Vikash_SFDC
sam samm
Hi Vikas,

I am not having a problem creating robots.txt; below you can see the content of my robots.txt:

User-agent: *
Disallow: 

My question is that this allows everything rather than just one page.
MissedCall
Create a Visualforce page like the one below and include the pages that you don't want to be indexed in Disallow:

<apex:page contentType="text/plain" showHeader="false">
    User-agent: *
    Disallow: /yourpagename1
    Disallow: /yourpagename2
</apex:page>
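
If the goal is the opposite, allowing only one page and blocking everything else (as in the original question), an Allow directive can help. Note this is a sketch: Allow is not part of the original robots.txt standard, but it is honored by major crawlers such as Googlebot, which give the more specific rule precedence. /myhomepage here stands in for the page name from the question:

<apex:page contentType="text/plain" showHeader="false">
    User-agent: *
    Allow: /myhomepage
    Disallow: /
</apex:page>

For crawlers that support Allow, /myhomepage stays crawlable while every other path is blocked; crawlers that ignore Allow will simply block everything.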
sam samm
What about static resources?
sam samm
I have more than a hundred pages. Is it possible to use something like /* instead of listing each one?
MissedCall
This is how to exclude all robots:

User-agent: *
Disallow: /

PS: you need a separate "Disallow" line for every URL prefix you want to exclude.

Good Luck!
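
On the /* question: a trailing wildcard is redundant, because robots.txt rules already match by prefix, so Disallow: / covers every path. Crawlers like Googlebot and Bingbot do support * inside paths, but that is an extension, not part of the original standard. As for static resources, on a Salesforce site they are typically served under the /resource/ path (an assumption worth verifying on your own site's URLs), so they can be blocked on their own like this:

<apex:page contentType="text/plain" showHeader="false">
    User-agent: *
    Disallow: /resource/
</apex:page>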