
Thread: Hiccups on transfer of robots.txt file (can't create directory for any page)

  1. #1
    Junior Member
    Join Date
    Apr 2012
    Location
    Mountain Home in Western North Carolina
    Posts
    21

    Default Hiccups on transfer of robots.txt file (can't create directory for any page)

    We just finished populating a 90-page website. Each day we would publish it locally to see if there were any diagnostics; there weren't. Then we FTPed it to the host and it seemed to be transferring okay until it began to transfer a "robots.txt" file. We never use robots.txt because, for the most part, it tells the spiders what not to do, and we never have pages we don't want spidered. After stalling for about three minutes on robots.txt, it produced 90 diagnostics (one for each page?), each stating that NOF could not create a directory for that page. Yet it still publishes, completely and satisfactorily, locally. Finally, to add insult to injury, NOF froze (not responding, according to Windows) and crashed. When it was restarted, NOF reported the website as corrupted and 'fixed' it. But the fix wasn't a true fix, so we reloaded from one of the backup templates.

    Our first website, created with NOF ‘Essentials’, was massacred by version 12. That was before we found out that the only way to reliably save a website in NOF is by exporting it to a template. Fortunately we had off-line backups, so we finished that website in ‘Essentials’ with no problems. But we thought we should try to get used to version 12 as updated, so this newer website was created in version 12's latest update. Bugs that don't exist in ‘Essentials’ are rampant in version 12, but we worked around them. This problem, however, we don't have a work-around for. I hope someone does.

  2. #2
    Senior Member gotFusion's Avatar
    Join Date
    Jan 2010
    Location
    www.gotHosting.biz
    Posts
    4,529

    Default

    How are you having Fusion do the robots.txt file? The best way is to write it in notepad and add it to Fusion as a file asset.

    See this gotfusion tutorial for details: http://www.gotfusion.com/tutorials/tut.cfm?itemID=10
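    For reference, a bare-bones robots.txt that simply allows every spider to crawl the whole site looks something like the sketch below. The directives are standard robots.txt syntax and nothing here is specific to Fusion; save it as plain text named robots.txt and add it as a file asset as described in the tutorial.

        # Allow every crawler to index the entire site
        # (an empty Disallow line means nothing is blocked)
        User-agent: *
        Disallow: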
    NetObjects Fusion Cloud Linux enabled Web Hosting, support + training starts at $14.95
    NetObjects Fusion web Hosting and support + ASP + PHP + ColdFusion + MySQL + MS SQL
    FREE NetObjects Fusion Support & training comes with all web hosting accounts
    NetObjects Fusion Web Hosting: http://www.gotHosting.biz

  3. #3
    Junior Member
    Join Date
    Apr 2012
    Location
    Mountain Home in Western North Carolina
    Posts
    21

    Default

    GoDaddy worked many hours on this problem and could not find the cause. They moved the site from the Linux server to a Windows server and it FTPed up without a hitch. They had me FTP another website to the old Linux server and it did so without any problems. Even so, I am going to continue on the Windows server.
    Response to Mike: As I said, we do not create robots.txt files on any websites. They are used primarily to tell the spiders what NOT to do; what you do want them to do, they are going to do anyway. Spiders have become very sophisticated. The only external file of any use is the XML-formatted site map. But thanks for your response. (If you wish to look at the site in question, the URL is hiway-guide.com.)
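    For anyone curious, a minimal XML sitemap following the sitemaps.org protocol looks roughly like the sketch below; the URL shown is only a placeholder, not the actual site, and real sitemaps list one <url> entry per page.

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- Minimal sitemap listing a single page -->
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.example.com/index.html</loc>
          </url>
        </urlset>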
