I’ve generated a Lighthouse report on a web app (still a work in progress), and in the SEO section of the report there are some issues that I don’t know how to fix.
The errors reported:
“robots.txt is not valid”
“syntax not understood” on all HTML tags
It’s a React app, so there is very little actual HTML, but it seems that something is not right in the way the HTML file is being read.
I’ve created a sitemap.xml and a robots.txt file and tried to follow debugging advice from various online sources, but none of it makes any difference: the robots.txt file is still reported as invalid and the syntax errors persist.
In case anyone ever runs into the same problem, here is what finally solved it for me:
In the public folder of your project, you’ll need the following files:
_redirects
When deploying SPAs with client-side routing to Netlify, add this rule to ensure that navigation works correctly:

```
/*   /index.html   200
```

And, to make sure that requests for robots.txt are not caught by that client-side routing fallback, add this rule as well (a complete example file follows below):

```
/robots.txt   /public/robots.txt   200
```
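Put together, the complete _redirects file might look like this. Netlify processes redirect rules from top to bottom and applies the first match, so it is safest to put the robots.txt rule above the catch-all:

```
# Serve the robots.txt file directly instead of falling through to the SPA
/robots.txt   /public/robots.txt   200

# SPA fallback: rewrite everything else to index.html for client-side routing
/*   /index.html   200
```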
_headers
Add this code snippet; note that in Netlify’s _headers format, the header lines are indented under the path they apply to:

```
/public/robots.txt
  Content-Type: text/plain
```
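Once deployed, you can check that the file now comes back as plain text; the site URL below is a placeholder for your own Netlify domain:

```
curl -I https://your-site.netlify.app/robots.txt
```

The response headers should include Content-Type: text/plain rather than text/html.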
robots.txt
Generate a robots.txt file that includes the URL of your sitemap (this can be done online with a robots.txt generator) and add its contents here; a minimal example is shown below.
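For reference, a minimal robots.txt could look like the following, where https://www.example.com is a placeholder for your own domain:

```
# Allow all crawlers to access the whole site
User-agent: *
Allow: /

# Point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```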
Without these steps, the robots.txt file in my app was served as HTML: the catch-all redirect to index.html kicked in, and the Lighthouse report warned about syntax errors in the robots.txt file.