Method | Description |
---|---|
CrawlDelay ( string userAgent ) : long | Gets the number of milliseconds to wait between successive requests for this robot. |
IsPathAllowed ( string userAgent, string path ) : bool | Checks if the given user-agent can access the given relative path. |
Load ( string content ) : Robots | Initializes a new RobotsTxt.Robots instance for the given robots.txt file content. |
Robots ( string content ) : System | Initializes a new RobotsTxt.Robots instance for the given robots.txt file content. |
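A minimal end-to-end sketch of the public methods above, assuming the class lives in the `RobotsTxt` namespace as the descriptions indicate; the robots.txt content and the `MyCrawler` user-agent string are illustrative:

```csharp
using System;
using System.Threading.Tasks;
using RobotsTxt;

class RobotsExample
{
    static async Task Main()
    {
        // Illustrative robots.txt content; in practice this string would be
        // downloaded from the target site's /robots.txt.
        string content =
            "User-agent: *\n" +
            "Disallow: /private/\n" +
            "Crawl-delay: 2\n";

        Robots robots = Robots.Load(content);

        // Check whether a relative path may be fetched for a given user agent.
        bool allowed = robots.IsPathAllowed("MyCrawler", "/private/data.html");
        Console.WriteLine(allowed ? "Allowed" : "Blocked by robots.txt");

        // CrawlDelay returns the wait in milliseconds between successive requests.
        long delayMs = robots.CrawlDelay("MyCrawler");
        await Task.Delay(TimeSpan.FromMilliseconds(delayMs));
    }
}
```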
Method | Description |
---|---|
isPathMatch ( string path, string rulePath ) : bool | |
normalizePath ( string path ) : string | |
readLines ( string lines ) : void | |
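The private helpers above have no descriptions in the source. As an illustration only, not the library's actual implementation, robots.txt rule matching is commonly a prefix comparison against a normalized request path:

```csharp
using System;

static class PathMatchSketch
{
    // Illustrative robots.txt-style rule matching; the library's private
    // isPathMatch/normalizePath helpers may handle more cases (wildcards,
    // empty rules, URL decoding) than this sketch does.
    public static bool IsPathMatch(string path, string rulePath)
    {
        // A Disallow/Allow rule matches when it is a prefix of the request path.
        return path.StartsWith(rulePath, StringComparison.Ordinal);
    }
}
```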
public CrawlDelay ( string userAgent ) : long | |
---|---|---|
userAgent | string | User agent string. |
return | long | |
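A short sketch of how the returned value might be used to pace requests; the helper name and `userAgent` parameter are illustrative:

```csharp
using System;
using System.Threading.Tasks;
using RobotsTxt;

static class CrawlDelayExample
{
    // Waits the robot's crawl delay (in milliseconds) before issuing the next request.
    public static async Task WaitCrawlDelayAsync(Robots robots, string userAgent)
    {
        long delayMs = robots.CrawlDelay(userAgent);
        if (delayMs > 0)
            await Task.Delay(TimeSpan.FromMilliseconds(delayMs));
    }
}
```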
public IsPathAllowed ( string userAgent, string path ) : bool | |
---|---|---|
userAgent | string | User agent string. |
path | string | Relative path. |
return | bool | |
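A small sketch that filters candidate paths through IsPathAllowed before fetching; the helper name is illustrative:

```csharp
using System.Collections.Generic;
using System.Linq;
using RobotsTxt;

static class IsPathAllowedExample
{
    // Returns only the relative paths the given user agent is allowed to fetch.
    public static List<string> FilterAllowed(Robots robots, string userAgent, IEnumerable<string> paths)
    {
        return paths.Where(p => robots.IsPathAllowed(userAgent, p)).ToList();
    }
}
```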
public static Load ( string content ) : Robots | |
---|---|---|
content | string | Content of the robots.txt file. |
return | Robots | |
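A sketch of the static factory in context, assuming the robots.txt content is downloaded with HttpClient; the helper name and URL handling are illustrative:

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using RobotsTxt;

static class LoadExample
{
    // Downloads a site's robots.txt and builds a Robots instance from its content.
    public static async Task<Robots> LoadForSiteAsync(string baseUrl)
    {
        using var http = new HttpClient();
        string content = await http.GetStringAsync(baseUrl.TrimEnd('/') + "/robots.txt");
        return Robots.Load(content);
    }
}
```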
public Robots ( string content ) : System | |
---|---|---|
content | string | Content of the robots.txt file. |
return | System | |
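A sketch of the constructor used directly, here with content read from a local file; the helper name and file handling are illustrative, and per the descriptions above the result is equivalent to calling Load with the same string:

```csharp
using System.IO;
using RobotsTxt;

static class ConstructorExample
{
    // Builds a Robots instance from a saved robots.txt file.
    public static Robots FromFile(string filePath)
    {
        string content = File.ReadAllText(filePath);
        return new Robots(content);
    }
}
```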