robots_txt library
Lightweight, fully documented `robots.txt` file parser.
Classes
- `Robots` - Stores information about a `robots.txt` file, exposing a simple and concise API for working with the file and validating whether a certain path can be accessed by a given user-agent (see the usage sketch below the list).
- `Rule` - A single rule (either `Allow` or `Disallow`) inside a `robots.txt` file.
- `Ruleset` - A collection of `Rule`s applicable to a particular `userAgent`.
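The following is a minimal usage sketch based on the package's documented example. It assumes a `Robots.parse()` factory and a `verifyCanAccess()` method that takes a path and a `userAgent` named parameter; check the API reference for the exact signatures in your version.

```dart
import 'package:robots_txt/robots_txt.dart';

void main() {
  // Contents of a robots.txt file, e.g. fetched over HTTP beforehand.
  const contents = '''
User-agent: *
Allow: /public/
Disallow: /private/
''';

  // Parse the contents into a `Robots` instance.
  final robots = Robots.parse(contents);

  // Ask whether a given user-agent may access a path.
  print(robots.verifyCanAccess('/public/page.html', userAgent: '*'));
  print(robots.verifyCanAccess('/private/page.html', userAgent: '*'));
}
```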
Enums
- `FieldType` - Defines a key-value field of a `robots.txt` file specifying a rule.
- `PrecedenceStrategy` - Defines the strategy used to compare rules by their precedence.
- `PrecedentRuleType` - Defines the method used to decide whether rules that explicitly allow a user-agent to access a path take precedence over rules that disallow it, or the other way around (a conceptual sketch follows the list).
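To make the relationship between these two enums concrete, here is a standalone conceptual sketch, not the library's implementation: matching rules are compared by a precedence value, taken here to be the length of the matched pattern, and ties between an Allow rule and a Disallow rule are broken according to a chosen policy. The names `SimpleRule`, `canAccess` and `allowWinsTies` are illustrative only.

```dart
// Conceptual sketch only; the real enums and comparison logic live in the package.
class SimpleRule {
  final bool allows;
  final String pattern;
  const SimpleRule(this.allows, this.pattern);

  // A longer (more specific) pattern is taken to have higher precedence.
  int get precedence => pattern.length;
}

bool canAccess(String path, List<SimpleRule> rules, {bool allowWinsTies = true}) {
  final matching = rules.where((rule) => path.startsWith(rule.pattern)).toList();
  if (matching.isEmpty) return true; // No rule applies: access is permitted.

  matching.sort((a, b) {
    // Higher precedence (more specific pattern) comes first.
    final byPrecedence = b.precedence.compareTo(a.precedence);
    if (byPrecedence != 0) return byPrecedence;
    // Tie-break between Allow and Disallow according to the chosen policy.
    if (a.allows == b.allows) return 0;
    return (a.allows == allowWinsTies) ? -1 : 1;
  });

  return matching.first.allows;
}

void main() {
  const rules = [
    SimpleRule(false, '/private/'),
    SimpleRule(true, '/private/press/'),
  ];
  print(canAccess('/private/press/release.html', rules)); // true: more specific Allow wins.
  print(canAccess('/private/data.html', rules));          // false: only the Disallow matches.
}
```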
Extensions
- `FindRule` on `List<Rule>` - Extends `List<Rule>` with methods used to find rules that pertain to a certain path.
- `FindRuleInRuleset` on `List<Ruleset>` - Extends `List<Ruleset>` with a method used to find a rule that matches the supplied filters.
- `Precedence` on `Rule?` - Extends `Rule?` with a getter `precedence` to avoid having to explicitly default to `-1` whenever attempting to access the hidden property `_precedence` on a null value (see the sketch below the list).
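A minimal sketch of the idea behind the `Precedence` extension, assuming a `Rule` type with a private `_precedence` field; the real extension is defined inside the package, where the library-private field is visible.

```dart
// Stand-in for the package's Rule type; `_precedence` mimics the hidden property.
class Rule {
  final int _precedence;
  const Rule(this._precedence);
}

extension Precedence on Rule? {
  // Returns -1 when the receiver is null, so callers need not write
  // `rule?._precedence ?? -1` at every call site.
  int get precedence => this?._precedence ?? -1;
}

void main() {
  const Rule? missing = null;
  print(missing.precedence);       // -1
  print(const Rule(3).precedence); // 3
}
```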