robots.txt is a plain-text file placed in the root folder of your website to help search engines index your site properly. It follows the Robots Exclusion Protocol, and this tool makes it easy to create a robots.txt file with entries for the pages you want excluded. A robots.txt file tells search engine crawlers which URLs on your site they may access. It is mainly used to manage crawler traffic and avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a page out of Google, block indexing with a noindex directive or password-protect the page.
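As a minimal sketch, a robots.txt file like the following could be served at the site root (the paths and sitemap URL here are illustrative placeholders, not values from any real site):

```
# Applies to all crawlers
User-agent: *
# Disallow crawling of these directories (hypothetical paths)
Disallow: /admin/
Disallow: /private/
# Explicitly allow everything else
Allow: /

# Optional: point crawlers at the sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow only discourages crawling; a disallowed URL can still appear in search results if other pages link to it, which is why noindex or password protection is the reliable way to keep a page out of the index.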