C# Class Rock.UniversalSearch.Crawler.RobotsTxt.Robots

Provides functionality for parsing a robots.txt file's content and querying the rules and directives inside it.
Open project: NewSpring/Rock

Public Methods

Method Description
CrawlDelay ( string userAgent ) : long

Gets the number of milliseconds to wait between successive requests for this robot.

IsPathAllowed ( string userAgent, string path ) : bool

Checks if the given user-agent can access the given relative path.

Load ( string content ) : Robots

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.

Robots ( string content ) : System

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.

Private Methods

Method Description
isPathMatch ( string path, string rulePath ) : bool
normalizePath ( string path ) : string
readLines ( string lines ) : void

Method Details

CrawlDelay() public method

Gets the number of milliseconds to wait between successive requests for this robot.
Throws an exception when the userAgent parameter is null, empty, or consists only of white-space characters.
public CrawlDelay ( string userAgent ) : long
userAgent string User agent string.
Return long
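A minimal sketch of querying the crawl delay before issuing successive requests. The robots.txt content and the user-agent string are illustrative; only the documented Load and CrawlDelay signatures are assumed.

```csharp
using Rock.UniversalSearch.Crawler.RobotsTxt;

class CrawlDelayExample
{
    static void Main()
    {
        // Illustrative robots.txt content with a 10-second crawl delay.
        string content = "User-agent: *\nCrawl-delay: 10\n";
        Robots robots = Robots.Load(content);

        // CrawlDelay returns milliseconds, per the documentation above.
        long delayMs = robots.CrawlDelay("MyCrawler");

        // Wait out the delay between successive requests.
        System.Threading.Thread.Sleep((int)delayMs);
    }
}
```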

IsPathAllowed() public method

Checks if the given user-agent can access the given relative path.
Throws an exception when the userAgent parameter is null, empty, or consists only of white-space characters.
public IsPathAllowed ( string userAgent, string path ) : bool
userAgent string User agent string.
path string Relative path.
Return bool
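A sketch of checking a relative path against a Disallow rule before crawling it. The rule set and paths are illustrative assumptions; the calls use only the signatures documented above.

```csharp
using Rock.UniversalSearch.Crawler.RobotsTxt;

class PathCheckExample
{
    static void Main()
    {
        // Illustrative robots.txt content that blocks one directory.
        string content = "User-agent: *\nDisallow: /private/\n";
        Robots robots = Robots.Load(content);

        // A path under the Disallow rule should not be crawled.
        bool blocked = robots.IsPathAllowed("MyCrawler", "/private/page.html");

        // A path outside the rule should be fine.
        bool allowed = robots.IsPathAllowed("MyCrawler", "/public/index.html");

        System.Console.WriteLine($"blocked={blocked}, allowed={allowed}");
    }
}
```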

Load() public static method

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.
public static Load ( string content ) : Robots
content string Content of the robots.txt file.
Return Robots
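A typical crawler first fetches a site's robots.txt and then hands the raw content to Load. The URL and the use of WebClient here are illustrative; Load itself only takes the file content as a string, per the signature above.

```csharp
using System.Net;
using Rock.UniversalSearch.Crawler.RobotsTxt;

class LoadExample
{
    static void Main()
    {
        // Hypothetical: download the target site's robots.txt before crawling.
        using (var client = new WebClient())
        {
            string content = client.DownloadString("https://example.com/robots.txt");

            // Load parses the content into a queryable Robots instance.
            Robots robots = Robots.Load(content);

            // The instance can now answer IsPathAllowed / CrawlDelay queries.
        }
    }
}
```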

Robots() public method

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.
public Robots ( string content ) : System
content string Content of the robots.txt file.
Return System
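The constructor takes the same robots.txt content as the static Load method, so the two are interchangeable ways to build an instance. The sample content is illustrative.

```csharp
using Rock.UniversalSearch.Crawler.RobotsTxt;

class ConstructorExample
{
    static void Main()
    {
        // An empty Disallow directive places no restrictions on crawling.
        string content = "User-agent: *\nDisallow:\n";

        // Construct directly instead of calling Robots.Load(content).
        var robots = new Robots(content);
    }
}
```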