Smartcrawl robots.txt shows 404 page not found

I had a robots.txt file on the site and Smartcrawl showed:

“We’ve detected an existing robots.txt file that we are unable to edit. You will need to remove it before you can enable this feature.”

Then I deleted this file and checked the admin again, and now I can see the robots.txt editor. But when I check the robots.txt URL, it shows a 404 page not found error.

  • PowerQuest
    • Code Wrangler

    Hey Wayne

    I had the exact same issue as you…

    Just rename the robots.txt file you have in your root to something like robots.old (or whatever), or simply delete it if you don’t want to keep a copy of it.
    After that, just reload Smartcrawl and it will work fine. At least it did for me. Take the contents of your old robots.txt file, paste them into Smartcrawl’s robots.txt section, and you’re done! :thumbsup:

    Kind regards
    PowerQuest

  • Alessandro
    • Nightcrawler & Daydreamer

    Hello Wayne.

    I checked your website and I can confirm that there is an issue with robots.txt delivery.

    Smartcrawl stores the robots.txt configuration in the database and serves the content dynamically when /robots.txt is requested. If a robots.txt file physically exists on the server, the contents of that file are served instead.
    Smartcrawl does NOT edit the file directly.
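
    For reference, this is roughly how the generic WordPress mechanism works: a plugin hooks the core robots_txt filter, which only fires when WordPress itself handles the /robots.txt request. This is a minimal sketch; the option name below is a placeholder for illustration, not Smartcrawl’s actual setting.

        <?php
        // Sketch: serve robots.txt rules stored in the database through the
        // core 'robots_txt' filter (runs only when WordPress handles the
        // /robots.txt request).
        add_filter( 'robots_txt', function ( $output, $is_public ) {
            // 'my_robots_rules' is a hypothetical option name.
            $rules = get_option( 'my_robots_rules', '' );

            // Fall back to WordPress' default virtual robots.txt when empty.
            return $rules !== '' ? $rules : $output;
        }, 10, 2 );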

    In your case, now that robots.txt no longer exists, Smartcrawl’s robots configuration should be served. Unfortunately this doesn’t work because the server handles the request, looks for the file, and triggers a 404 error (as the file does not exist in the directory) before running any WordPress hook (which would serve the file dynamically). As a result, you get a not found error and no robots file.

    At this time, this behavior is most likely server related, and additional server configuration would need to be applied to mitigate the issue.
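
    For example, on an nginx server a location block along these lines is a common way to hand a missing /robots.txt over to WordPress instead of returning a plain 404 (a generic sketch; the exact rules depend on your hosting configuration):

        # Pass /robots.txt through to WordPress when no physical file exists,
        # so plugins can generate it dynamically.
        location = /robots.txt {
            try_files $uri /index.php?$args;
            access_log off;
            log_not_found off;
        }

    On Apache, the standard WordPress .htaccess rules (RewriteCond %{REQUEST_FILENAME} !-f) already send requests for missing files to index.php, so a 404 there usually means another rule is overriding them.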

    As a workaround, we recommend manually creating a robots.txt file following our suggestions here:
    https://wpmudev.com/docs/wpmu-dev-plugins/smartcrawl/#Robots-txt
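
    If you go that route, a typical WordPress robots.txt looks something like this (the sitemap URL is just an example, replace it with your own):

        User-agent: *
        Disallow: /wp-admin/
        Allow: /wp-admin/admin-ajax.php

        Sitemap: https://example.com/sitemap.xml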

    Let us know if you need further support or help with your robots.txt file.

    Kind regards,
    Alex.

  • Joe
    • Designer

    hi, thank you for your assistance…

    I think I will just have to leave this for the time being, as it’s consuming too much of my time and this site is not that important… when the time comes I will migrate to wpmu and hope this can be solved then. TQ