robots_txt 1.1.1

A lightweight `robots.txt` ruleset parser to ensure your application adheres to the de facto standard.


Usage

The following code retrieves the robot exclusion ruleset (`robots.txt`) of a website.

`quietMode` determines whether the library should print warning messages if the `robots.txt` file is invalid or another error occurs.

```dart
import 'package:robots_txt/robots_txt.dart';

// Create an instance of the `robots.txt` parser.
final robots = Robots(host: 'https://github.com/');
// Read the ruleset of the website.
await robots.read();
```
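For instance, here is a minimal sketch of enabling quiet mode, assuming `quietMode` is a named parameter on the `Robots` constructor (the excerpt above does not show the exact signature):

```dart
import 'package:robots_txt/robots_txt.dart';

Future<void> main() async {
  // Assumption: passing `quietMode: true` suppresses the parser's
  // warning output; the exact parameter placement may differ.
  final robots = Robots(host: 'https://github.com/', quietMode: true);
  await robots.read();
}
```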

Now that the `robots.txt` file has been read, we can check whether a given path may be visited:

```dart
final userAgent = '*';
print("Can '$userAgent' visit '/gist/'?");
print(robots.canVisitPath('/gist/', userAgent: userAgent)); // It cannot
print("Can '$userAgent' visit '/wordcollector/robots_txt'?");
print(robots.canVisitPath('/wordcollector/robots_txt', userAgent: userAgent)); // It can
```
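For context, a robots.txt exclusion check generally reduces to matching the request path against the most specific `Allow`/`Disallow` rule for the user agent. The following self-contained sketch illustrates that idea; the `Rule` class and `canVisit` function are hypothetical, not this package's internals:

```dart
/// A hypothetical, simplified rule: a path prefix plus whether it is allowed.
class Rule {
  const Rule(this.prefix, {required this.allowed});

  final String prefix;
  final bool allowed;
}

/// Returns whether [path] may be visited under [rules], using the common
/// "longest matching prefix wins" convention; unmatched paths are allowed.
bool canVisit(String path, List<Rule> rules) {
  Rule? best;
  for (final rule in rules) {
    if (path.startsWith(rule.prefix) &&
        (best == null || rule.prefix.length > best.prefix.length)) {
      best = rule;
    }
  }
  return best?.allowed ?? true;
}

void main() {
  final rules = [
    const Rule('/gist/', allowed: false),
    const Rule('/wordcollector/', allowed: true),
  ];
  print(canVisit('/gist/', rules)); // false
  print(canVisit('/wordcollector/robots_txt', rules)); // true
}
```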

Publisher

vxern.dev (verified publisher)


Repository (GitHub)
View/report issues

Documentation

API reference

License

MIT (LICENSE)

Dependencies

sprint, web_scraper

More

Packages that depend on robots_txt