We just finished populating a 90-page website. Each day we would publish it locally to check for diagnostics; there were none. Then we FTPed it to the host, and the transfer seemed to be going fine until it reached a "robots.txt" file. We never use robots.txt because, for the most part, it tells the spiders what not to do, and we never have pages we don't want spidered. After stalling about three minutes on robots.txt, NOF produced 90 diagnostics (one for each page?), each stating that it could not create a directory for the given page. Yet the site still publishes locally, completely and satisfactorily. Finally, to add insult to injury, NOF froze (Windows reported it as not responding) and crashed. When it was restarted, NOF said the website was polluted and 'fixed' it. But the fix wasn't a true fix, so we reloaded from one of our backup templates.
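For anyone unfamiliar with the file: a robots.txt that blocks nothing at all is only two lines (we don't know what NOF put in the one it tried to upload):

    User-agent: *
    Disallow:

An empty Disallow line means no URLs are off-limits to crawlers, which is effectively the same as having no robots.txt at all.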
Our first website, created with NOF 'Essentials', was massacred by version 12. That was before we found out that the only way to reliably save a website in NOF is by exporting it to a template. Fortunately we had offline backups, so we finished that website in 'Essentials' with no problems. But we thought we should get used to version 12 as updated, so this newer website was created in version 12's latest update. Bugs that don't exist in 'Essentials' are rampant in version 12, but we worked around them. This problem, however, we have no workaround for. I hope someone does.