Source: ai.robots.txt
Section: web
Priority: optional
Maintainer: Daniel Baumann <daniel@debian.org>
Build-Depends:
 debhelper-compat (= 13),
 jq,
Rules-Requires-Root: no
Standards-Version: 4.7.2
Homepage: https://github.com/ai-robots-txt/ai.robots.txt
Vcs-Browser: https://forgejo.debian.net/web/ai.robots.txt
Vcs-Git: https://forgejo.debian.net/web/ai.robots.txt

Package: apache2-ai-bots
Section: web
Architecture: all
Depends:
 apache2,
 ${misc:Depends},
Description: list of AI agents and robots to block (apache2)
 ai.robots.txt is a list containing AI-related crawlers of all types, regardless
 of purpose.
 .
 Blocking access based on the User-Agent header does not stop every crawler,
 but it is a simple, low-overhead way of deterring most of them.
 .
 This package contains the apache2 integration; see
 /usr/share/doc/apache2-ai-bots/README.Debian for instructions on enabling it.
