
Robots.txt Plugin

v0.2.1

A robots.txt middleware plugin for Traefik.

Table of Contents

  1. Description
  2. Setup
  3. Usage
  4. Reference
  5. Development
  6. Contributors

Description

Robots.txt is a middleware plugin for Traefik that adds rules to your website's /robots.txt, based on the ai.robots.txt list and/or on custom rules you define.
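For example, with aiRobotsTxt enabled and a custom rule configured, the served /robots.txt might look something like the following. The ai.robots.txt entries change over time, so the user agents shown here are purely illustrative:

# Original robots.txt content of your site
User-agent: *
Disallow: /admin/

# Appended from the ai.robots.txt list (illustrative entries)
User-agent: GPTBot
User-agent: ClaudeBot
Disallow: /

# Appended custom rules
User-agent: *
Disallow: /private/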

Setup

# Static configuration
experimental:
  plugins:
    robots-txt:
      moduleName: github.com/solution-libre/traefik-plugin-robots-txt
      version: v0.2.1
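If you pass the static configuration on the command line instead of in a file, the equivalent flags (standard Traefik CLI syntax, not specific to this plugin) would be:

--experimental.plugins.robots-txt.modulename=github.com/solution-libre/traefik-plugin-robots-txt
--experimental.plugins.robots-txt.version=v0.2.1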

Usage

# Dynamic configuration
http:
  routers:
    my-router:
      rule: Host(`localhost`)
      service: service-foo
      entryPoints:
        - web
      middlewares:
        - my-robots-txt

  services:
    service-foo:
      loadBalancer:
        servers:
          - url: http://127.0.0.1

  middlewares:
    my-robots-txt:
      plugin:
        robots-txt:
          aiRobotsTxt: true
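If you run Traefik with the Docker provider, the same middleware can be declared through container labels. A minimal sketch using Traefik's standard label syntax (the router and middleware names are the ones from the example above):

labels:
  - "traefik.http.routers.my-router.rule=Host(`localhost`)"
  - "traefik.http.routers.my-router.middlewares=my-robots-txt"
  - "traefik.http.middlewares.my-robots-txt.plugin.robots-txt.aiRobotsTxt=true"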

Reference

Name         Description                                     Default value  Example
aiRobotsTxt  Enable the retrieval of the ai.robots.txt list  false          true
customRules  Add custom rules at the end of the file                        \nUser-agent: *\nDisallow: /private/\n
overwrite    Remove the original robots.txt file content     false          true
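These options can be combined. For instance, based on the option descriptions above, the following middleware definition should replace the upstream robots.txt content entirely with your custom rules (a sketch; the exact interaction of overwrite with the other options is an assumption drawn from the reference table):

# Dynamic configuration (middleware only)
middlewares:
  my-robots-txt:
    plugin:
      robots-txt:
        overwrite: true
        customRules: "\nUser-agent: *\nDisallow: /private/\n"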

Development

Solution Libre's repositories are open projects, and community contributions are essential for keeping them great.

Fork this repo on GitHub.

Contributors

The list of contributors can be found at: https://github.com/solution-libre/traefik-plugin-robots-txt/graphs/contributors