Generate custom robots.txt files to control search engine crawling on your website. The site URL you enter is also used for the Sitemap directive, which tells crawlers where to find your XML sitemap.
The robots.txt file tells search engine crawlers which pages or files they can or cannot request from your site.
Disallow: admin areas, login pages, temporary files, and duplicate content.
Allow: public content and CSS/JS files (crawlers need these to render your pages properly).
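Putting those rules together, a minimal example is sketched below; the paths and the example.com sitemap URL are placeholders for your own site:

```
# Rules for all crawlers
User-agent: *

# Block crawling of private or low-value areas
Disallow: /admin/
Disallow: /login/
Disallow: /tmp/

# Keep rendering assets crawlable
Allow: /css/
Allow: /js/

# Point crawlers to the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```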
Always test robots.txt in Google Search Console before deploying.
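For a quick local check before uploading, Python's standard-library urllib.robotparser can parse a draft file and report whether a given URL is crawlable. A minimal sketch, assuming the draft is saved as robots.txt and the URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Parse a local draft of robots.txt (the file path is a placeholder)
parser = RobotFileParser()
with open("robots.txt") as f:
    parser.parse(f.read().splitlines())

# Ask whether a crawler matching "*" may fetch each URL
print(parser.can_fetch("*", "https://www.example.com/admin/settings"))  # False: /admin/ is disallowed
print(parser.can_fetch("*", "https://www.example.com/blog/post-1"))     # True: not blocked
```

This applies the same Allow/Disallow matching logic crawlers use, but Google Search Console remains the authoritative test for how Googlebot will actually behave.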