Robots class

Stores information about a robots.txt file and exposes a simple, concise API for working with it, including checking whether a given path may be accessed by a particular user-agent.

Constructors

Robots.parse(String contents, {Set<String>? onlyApplicableTo})
Parses the contents of a robots.txt file, creating an instance of Robots.
factory
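As a brief sketch of how `Robots.parse` might be used (assuming the class is imported from the `robots_txt` package, and with sample file contents invented for illustration):

```dart
import 'package:robots_txt/robots_txt.dart';

void main() {
  // Hypothetical robots.txt contents, used purely for illustration.
  const contents = '''
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
''';

  // Parse every ruleset in the file. Pass `onlyApplicableTo` to keep
  // only the rulesets that apply to the named user-agents.
  final robots = Robots.parse(contents);

  print(robots.sitemaps); // Links declared via `Sitemap:` fields.
  print(robots.rulesets.length);
}
```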

Properties

hashCode → int
The hash code for this object.
no setter, inherited
hosts → List<String>
Stores the preferred domains for a website with multiple mirrors.
final
rulesets → List<Ruleset>
Stores information about the rules specified for given user-agents.
final
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited
sitemaps → List<String>
Stores links to the website's sitemaps.
final

Methods

noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited
verifyCanAccess(String path, {required String userAgent, PrecedentRuleType typePrecedence = PrecedentRuleType.defaultPrecedentType, PrecedenceStrategy comparisonMethod = PrecedenceStrategy.defaultStrategy}) → bool
Checks if the robots.txt file allows userAgent to access path.
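A minimal sketch of checking path access with `verifyCanAccess`, using the parameters listed above and a hypothetical file and user-agent name:

```dart
import 'package:robots_txt/robots_txt.dart';

void main() {
  // Hypothetical robots.txt contents for illustration only.
  const contents = '''
User-agent: crawler
Disallow: /private/
''';

  final robots = Robots.parse(contents);

  // Returns `true` when no rule forbids the path for this user-agent.
  final canAccess =
      robots.verifyCanAccess('/public/page.html', userAgent: 'crawler');
  final blocked =
      robots.verifyCanAccess('/private/data.html', userAgent: 'crawler');

  print('public: $canAccess, private: $blocked');
}
```

The optional `typePrecedence` and `comparisonMethod` parameters control how conflicting `Allow`/`Disallow` rules are weighed; the defaults shown in the signature are used here.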

Operators

operator ==(Object other) → bool
The equality operator.
inherited

Static Methods

validate(String contents, {Set<String> allowedFieldNames = const {}}) → void
Taking the contents of a robots.txt file, ensures that the file is valid, throwing a FormatException if it is not.
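A hedged sketch of using the static `validate` method to reject a malformed file (the field name below is invented for illustration, and the assumption is that fields not listed in `allowedFieldNames` cause the FormatException):

```dart
import 'package:robots_txt/robots_txt.dart';

void main() {
  // A hypothetical file with a field name the parser does not recognise.
  const malformed = 'Invalid-field: value';

  try {
    // Throws a FormatException unless the unknown field name is
    // whitelisted via `allowedFieldNames`.
    Robots.validate(malformed);
    print('File is valid.');
  } on FormatException catch (exception) {
    print('Invalid robots.txt: ${exception.message}');
  }
}
```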