C# Class Rock.UniversalSearch.Crawler.RobotsTxt.Robots

Provides functionality for parsing a robots.txt file's content and querying the rules and directives inside it.
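A minimal end-to-end sketch of the class in use; the robots.txt content and the "RockCrawler" user-agent string are illustrative assumptions, not values from the Rock project:

using Rock.UniversalSearch.Crawler.RobotsTxt;

// Hypothetical robots.txt content (normally fetched from the target site).
string content = "User-agent: *\nDisallow: /admin/\nCrawl-delay: 2";
Robots robots = Robots.Load( content );

bool allowed = robots.IsPathAllowed( "RockCrawler", "/admin/settings" ); // expected: false
long delayMs = robots.CrawlDelay( "RockCrawler" );                      // wait time in milliseconds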

Public Methods

Method Description
CrawlDelay ( string userAgent ) : long

Gets the number of milliseconds to wait between successive requests for the given robot (user agent).

IsPathAllowed ( string userAgent, string path ) : bool

Checks if the given user-agent can access the given relative path.

Load ( string content ) : Robots

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.

Robots ( string content )

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.

Private Methods

Method Description
isPathMatch ( string path, string rulePath ) : bool
normalizePath ( string path ) : string
readLines ( string lines ) : void

Method Details

CrawlDelay() public method

Gets the number of milliseconds to wait between successive requests for the given robot (user agent).
Exceptions: Thrown when the userAgent parameter is null, empty, or consists only of white-space characters.
public CrawlDelay ( string userAgent ) : long
userAgent string User agent string.
return long
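A short usage sketch; the Crawl-delay value and user-agent string below are assumptions for illustration:

using Rock.UniversalSearch.Crawler.RobotsTxt;

var robots = Robots.Load( "User-agent: *\nCrawl-delay: 2" );

// robots.txt declares Crawl-delay in seconds; per the summary above, the
// return value is in milliseconds, so a 2-second directive should read as 2000.
long delayMs = robots.CrawlDelay( "RockCrawler" );
System.Threading.Thread.Sleep( (int)delayMs ); // throttle between successive requests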

IsPathAllowed() public method

Checks if the given user-agent can access the given relative path.
Exceptions: Thrown when the userAgent parameter is null, empty, or consists only of white-space characters.
public IsPathAllowed ( string userAgent, string path ) : bool
userAgent string User agent string.
path string Relative path.
return bool
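A usage sketch with an assumed Disallow rule; paths are given relative to the site root:

using Rock.UniversalSearch.Crawler.RobotsTxt;

var robots = Robots.Load( "User-agent: *\nDisallow: /admin/" );

bool adminAllowed = robots.IsPathAllowed( "RockCrawler", "/admin/settings" ); // expected: false
bool homeAllowed = robots.IsPathAllowed( "RockCrawler", "/" );                // expected: true

Passing a null, empty, or white-space userAgent throws, so validate the value before calling.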

Load() public static method

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.
public static Load ( string content ) : Robots
content string Content of the robots.txt file.
return Robots
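Load is a static factory with the same effect as the constructor below; a minimal sketch, assuming the robots.txt file has already been fetched to disk:

using Rock.UniversalSearch.Crawler.RobotsTxt;

string content = System.IO.File.ReadAllText( "robots.txt" ); // hypothetical local copy
Robots robots = Robots.Load( content );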

Robots() public method

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.
public Robots ( string content )
content string Content of the robots.txt file.
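Equivalent construction with new; the content string is illustrative:

using Rock.UniversalSearch.Crawler.RobotsTxt;

var robots = new Robots( "User-agent: *\nDisallow: /private/" );
bool allowed = robots.IsPathAllowed( "RockCrawler", "/private/page" ); // expected: false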