Free Tool Guide · 5 min read

Control how AI bots like ChatGPT and Claude access your website content.
AI companies like OpenAI, Anthropic, and Google crawl the web to train their models. Just as robots.txt controls search-engine crawlers, llms.txt is an emerging standard for controlling AI crawler access.
llms.txt is a proposed standard that lets website owners communicate with AI crawlers. It sits alongside robots.txt in your site's root directory and specifies which AI crawlers may access your content and how they may use it (for example, for training or summarization, or only with attribution).
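As an illustration, here is what such a file might look like. The robots.txt-style directive syntax and the crawler names below are assumptions for the sake of example; the standard is still emerging, and the file your generator produces may use different syntax.

```
# llms.txt for example.com

User-Agent: GPTBot
Disallow: /

User-Agent: ClaudeBot
Allow: /

# Preference: attribution-required
```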
Step 1: Enter Your Website URL
Add your domain name. This appears in the generated file for identification.
Step 2: Select AI Bots to Allow/Block
Toggle each AI crawler (ChatGPT, Claude, Gemini, and others) on or off depending on whether you want it to access your content.
Step 3: Set Content Preferences
Specify whether AI can use your content for training or summarization, and whether you require attribution or citations.
Step 4: Download and Install
Download the generated llms.txt file and upload it to your website's root directory (same location as robots.txt).
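The steps above can be sketched as a small generator. This is only an illustration of what such a tool does with your choices, not the tool's actual code: the robots.txt-style directives (`User-Agent`, `Allow`, `Disallow`) and the crawler names (`GPTBot`, `ClaudeBot`) are assumptions, and your generator's output format may differ.

```python
def generate_llms_txt(domain: str, bots: dict[str, bool], preferences: list[str]) -> str:
    """Build the file body: a header naming the domain (Step 1),
    per-bot allow/block rules (Step 2), and content-preference
    comments (Step 3)."""
    lines = [f"# llms.txt for {domain}", ""]
    for bot, allowed in bots.items():
        lines.append(f"User-Agent: {bot}")
        lines.append("Allow: /" if allowed else "Disallow: /")
        lines.append("")
    for pref in preferences:
        lines.append(f"# Preference: {pref}")
    return "\n".join(lines) + "\n"

# Example: block OpenAI's crawler, allow Anthropic's, require attribution.
print(generate_llms_txt(
    "example.com",
    {"GPTBot": False, "ClaudeBot": True},
    ["no-training", "attribution-required"],
))
```

The result is plain text, which is why Step 4 is just a file upload: there is nothing to execute on your server.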
Once uploaded to your website's root directory, the file should be accessible at:
https://yourdomain.com/llms.txt
This is the same location where robots.txt lives. Most web hosts let you upload files via FTP, cPanel, or your CMS's file manager.
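To confirm the installation worked, fetch `/llms.txt` from your site root and check for an HTTP 200 response. The sketch below does exactly that; since we can't reach your real host from here, it serves a temporary directory locally to stand in for your web host's document root (against your live site you would pass `https://yourdomain.com` instead).

```python
import http.server
import os
import tempfile
import threading
import urllib.request
from functools import partial

def check_llms_txt(root_url: str) -> bool:
    """Return True if <root>/llms.txt responds with HTTP 200."""
    try:
        with urllib.request.urlopen(root_url.rstrip("/") + "/llms.txt") as resp:
            return resp.status == 200
    except OSError:
        return False

# Simulate a document root containing the generated file.
docroot = tempfile.mkdtemp()
with open(os.path.join(docroot, "llms.txt"), "w") as f:
    f.write("# llms.txt\nUser-Agent: GPTBot\nDisallow: /\n")

# Serve it on a random local port, the way a web host would serve your root.
handler = partial(http.server.SimpleHTTPRequestHandler, directory=docroot)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ok = check_llms_txt(f"http://127.0.0.1:{server.server_port}")
print(ok)  # → True when the file sits at the root
```

If the check fails on your live site, the file is most likely in a subdirectory rather than the document root, or your CMS is rewriting the URL.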
Ready to control AI access to your site?
Generate your llms.txt file in seconds: free, no signup required.
Open llms.txt Creator