Create a robots.txt file to control how search engines crawl your website.
This free tool helps you create a robots.txt file for your website. A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site.
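For example, a minimal robots.txt file might look like the following sketch (the /admin/ path and the sitemap URL are placeholders, not recommendations):

```
# Rules below apply to all crawlers
User-agent: *

# Keep crawlers out of a private area (placeholder path)
Disallow: /admin/

# Everything else may be requested
Allow: /

# Optionally point crawlers at your sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

Each User-agent line starts a group of rules for the named crawler (* matches all crawlers), and the Disallow and Allow rules in that group are matched against URL paths on your site.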
The robots.txt file is part of the Robots Exclusion Protocol (REP), a group of web standards that regulate how robots crawl the web, access and index content, and serve that content up to users. The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-wide instructions for how search engines should treat links (such as "follow" or "nofollow").
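To illustrate the distinction, a meta robots directive lives in a page's HTML head rather than in robots.txt and applies only to that page. A hypothetical page-level example:

```html
<!-- Allow this page in search results, but do not follow its links -->
<meta name="robots" content="index, nofollow">

<!-- Alternatively, keep the page out of search results entirely -->
<meta name="robots" content="noindex">
```

Note that robots.txt controls crawling, while meta directives such as noindex control indexing: a page blocked in robots.txt can still appear in search results if other sites link to it, because the crawler never fetches the page and so never sees its meta directives.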
This tool operates entirely in your browser. We do not store your settings or the generated robots.txt content on our servers.