Robots class

Stores information about a robots.txt file, exposing a simple and concise API for working with the file and validating whether a certain path can be accessed by a given user-agent.
Constructors

- Robots.parse(String contents, {Set<String>? onlyApplicableTo})
  Parses the contents of a robots.txt file, creating an instance of Robots.
  factory
Properties

- hashCode → int
  The hash code for this object.
  no setter; inherited
- hosts → List<String>
  Stores the preferred domains for a website with multiple mirrors.
  final
- rulesets → List<Ruleset>
  Stores information about the rules specified for given user-agents.
  final
- runtimeType → Type
  A representation of the runtime type of the object.
  no setter; inherited
- sitemaps → List<String>
  Stores links to the website's sitemaps.
  final
Methods

- noSuchMethod(Invocation invocation) → dynamic
  Invoked when a nonexistent method or property is accessed.
  inherited
- toString() → String
  A string representation of this object.
  inherited
- verifyCanAccess(String path, {required String userAgent, PrecedentRuleType typePrecedence = PrecedentRuleType.defaultPrecedentType, PrecedenceStrategy comparisonMethod = PrecedenceStrategy.defaultStrategy}) → bool
  Checks whether the robots.txt file allows userAgent to access path.
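The members above can be combined into a short usage sketch: parse a robots.txt body with Robots.parse, then query it with verifyCanAccess. The import path and the sample robots.txt contents are assumptions for illustration; only the Robots.parse and verifyCanAccess signatures documented on this page are relied upon.

```dart
// Assumed import path for the package this page documents.
import 'package:robots_txt/robots_txt.dart';

void main() {
  // Hypothetical robots.txt contents for illustration.
  const contents = '''
User-agent: *
Allow: /
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
''';

  // Parse the file into a `Robots` instance.
  final robots = Robots.parse(contents);

  // Check whether a crawler identifying as 'ExampleBot' may access paths.
  final canReadRoot =
      robots.verifyCanAccess('/', userAgent: 'ExampleBot');
  final canReadAdmin =
      robots.verifyCanAccess('/admin/secret', userAgent: 'ExampleBot');

  print('root: $canReadRoot, admin: $canReadAdmin');

  // Sitemap links are exposed on the parsed instance.
  print(robots.sitemaps);
}
```

Passing onlyApplicableTo to Robots.parse would restrict parsing to rules for the named user-agents; typePrecedence and comparisonMethod on verifyCanAccess tune how conflicting Allow/Disallow rules are resolved, per the defaults shown in the signature above.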
Operators

- operator ==(Object other) → bool
  The equality operator.
  inherited