Robots.txt is a middleware plugin for Traefik that appends rules to your website's /robots.txt, drawn from the ai.robots.txt list and/or your own custom rules.
```yaml
# Static configuration
experimental:
  plugins:
    robots-txt:
      moduleName: github.com/solution-libre/traefik-plugin-robots-txt
      version: v0.2.1
```
```yaml
# Dynamic configuration
http:
  routers:
    my-router:
      rule: Host(`localhost`)
      service: service-foo
      entryPoints:
        - web
      middlewares:
        - my-robots-txt

  services:
    service-foo:
      loadBalancer:
        servers:
          - url: http://127.0.0.1

  middlewares:
    my-robots-txt:
      plugin:
        robots-txt:
          aiRobotsTxt: true
```
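If you configure Traefik through Docker labels rather than a file provider, the same wiring can be declared on a container. A minimal sketch, assuming a Docker provider is enabled and using Traefik's generic label convention for plugin middlewares (`traefik.http.middlewares.<name>.plugin.<plugin-name>.<option>`); the `whoami` service is illustrative:

```yaml
# docker-compose.yml (sketch): same router and middleware, expressed as labels.
services:
  whoami:
    image: traefik/whoami
    labels:
      - "traefik.http.routers.my-router.rule=Host(`localhost`)"
      - "traefik.http.routers.my-router.entrypoints=web"
      - "traefik.http.routers.my-router.middlewares=my-robots-txt"
      - "traefik.http.middlewares.my-robots-txt.plugin.robots-txt.aiRobotsTxt=true"
```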
| Name | Description | Default value | Example |
|---|---|---|---|
| `aiRobotsTxt` | Enable retrieval of the ai.robots.txt list | `false` | `true` |
| `customRules` | Custom rules appended to the end of the file | | `\nUser-agent: *\nDisallow: /private/\n` |
| `overwrite` | Discard the original robots.txt content instead of appending to it | `false` | `true` |
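The options can be combined. A minimal sketch of a dynamic configuration that discards the backend's own robots.txt and serves only the ai.robots.txt list plus the custom rules from the table above:

```yaml
# Dynamic configuration (sketch): upstream list plus custom rules,
# replacing whatever robots.txt the backend itself serves.
http:
  middlewares:
    my-robots-txt:
      plugin:
        robots-txt:
          aiRobotsTxt: true
          overwrite: true
          customRules: "\nUser-agent: *\nDisallow: /private/\n"
```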
Solution Libre's repositories are open projects, and community contributions are essential for keeping them great.
The list of contributors can be found at: https://github.com/solution-libre/traefik-plugin-robots-txt/graphs/contributors