Hi there,
I've just built a website in Muse, and when I try to upload / test the sitemap.xml (the .xml generated by Muse) in Google Webmaster Tools, the following message shows up:
"Unable to access the network: Unable to access robots.txt. We were unable to crawl your sitemap because we were unable to download the robots.txt file from the root of your site. Make sure it is accessible or eliminate it completely."
Initially there was no robots.txt on the server, and Muse doesn't automatically generate one. I tried creating one with the following content, but I still get the same message:
User-agent: *
Disallow:
Now there's no robots.txt file on the server.
You can access my .xml file here: http://cristiantomafilmmaker.com/sitemap.xml
Does anyone know why this error occurs?
Thank you.
At the moment there is no robots.txt file on your server for Google bots to find.
http://cristiantomafilmmaker.com/robots.txt
Muse cannot generate a robots.txt file for you. You need to generate one separately and manually FTP it to the root of your server. See link below.
Create a robots.txt file - Search Console Help
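As a starting point, a minimal robots.txt that allows all crawlers and also points Google at your sitemap might look like this (the Sitemap line is optional but supported by Google; the URL below is the one from your post):

```
# Allow all crawlers to access the entire site
User-agent: *
Disallow:

# Optional: tell crawlers where the sitemap lives
Sitemap: http://cristiantomafilmmaker.com/sitemap.xml
```

Save it as a plain-text file named exactly robots.txt and FTP it to the root of the site, so it is reachable at http://cristiantomafilmmaker.com/robots.txt. If Webmaster Tools still reports the error after the file is in place, it may just need some time before Google re-fetches it, so try the sitemap test again later.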