Can anyone please tell me what is happening here?
I am trying to block web crawlers from certain directories.
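For reference, the file I'm uploading is a standard robots.txt; the directory names below are placeholders, not my actual paths:

```
User-agent: *
Disallow: /private/
Disallow: /admin/
```
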
I create the text file and FTP it to the root; however, the following day it has been overwritten with just the line below:
Sitemap: http://www.bacabecopark.com/sitemap.xml
I have confirmed many times that the correct file is in the root, only to find later that it has been overwritten.
Can someone please suggest what might be happening?
Thanks