C# Class Rock.UniversalSearch.Crawler.RobotsTxt.Robots

Provides functionality for parsing a robots.txt file's content and querying the rules and directives inside it.

Public Methods

Method Description
CrawlDelay ( string userAgent ) : long

Gets the number of milliseconds to wait between successive requests for this robot.

IsPathAllowed ( string userAgent, string path ) : bool

Checks if the given user-agent can access the given relative path.

Load ( string content ) : Robots

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.

Robots ( string content ) : System

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.

Private Methods

Method Description
isPathMatch ( string path, string rulePath ) : bool
normalizePath ( string path ) : string
readLines ( string lines ) : void
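
The private helpers are listed without descriptions. For illustration only, the sketch below shows how a robots.txt rule matcher along the lines of isPathMatch is commonly implemented, treating '*' as a wildcard and a trailing '$' as an end-of-path anchor; this is an assumption about typical behavior, not this class's actual code.

```csharp
using System;
using System.Text.RegularExpressions;

class PathMatchSketch
{
    // Common robots.txt matching convention (assumed, not taken from this class):
    // the rule is a prefix pattern where '*' matches any character sequence
    // and a trailing '$' anchors the match to the end of the path.
    static bool IsPathMatch( string path, string rulePath )
    {
        string pattern = "^" + Regex.Escape( rulePath )
            .Replace( @"\*", ".*" )
            .Replace( @"\$", "$" );
        return Regex.IsMatch( path, pattern );
    }

    static void Main()
    {
        Console.WriteLine( IsPathMatch( "/private/file.html", "/private/" ) ); // True
        Console.WriteLine( IsPathMatch( "/docs/page.php", "/*.php$" ) );       // True
        Console.WriteLine( IsPathMatch( "/docs/page.php?x=1", "/*.php$" ) );   // False
    }
}
```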

Method Details

CrawlDelay() public method

Gets the number of milliseconds to wait between successive requests for this robot.
An exception is thrown when the userAgent parameter is null, empty, or consists only of white-space characters.
public CrawlDelay ( string userAgent ) : long
userAgent string User agent string.
Returns long
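
A minimal usage sketch; the robots.txt content and user-agent string are illustrative, and the seconds-to-milliseconds conversion noted in the comment is an assumption based on the method's documented return unit.

```csharp
using System;
using Rock.UniversalSearch.Crawler.RobotsTxt;

class CrawlDelayExample
{
    static void Main()
    {
        // Hypothetical robots.txt content declaring a crawl delay for all agents.
        string content = "User-agent: *\nCrawl-delay: 10";
        Robots robots = Robots.Load( content );

        // The return value is documented in milliseconds, so "Crawl-delay: 10"
        // (expressed in seconds in robots.txt) presumably yields 10000 here.
        long delayMs = robots.CrawlDelay( "MyBot" );
        Console.WriteLine( delayMs );
    }
}
```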

IsPathAllowed() public method

Checks if the given user-agent can access the given relative path.
An exception is thrown when the userAgent parameter is null, empty, or consists only of white-space characters.
public IsPathAllowed ( string userAgent, string path ) : bool
userAgent string User agent string.
path string Relative path.
Returns bool
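
A minimal usage sketch, assuming standard robots.txt Disallow semantics; the content, user-agent string, and paths are illustrative.

```csharp
using System;
using Rock.UniversalSearch.Crawler.RobotsTxt;

class IsPathAllowedExample
{
    static void Main()
    {
        // Hypothetical robots.txt content blocking /admin/ for every agent.
        string content = "User-agent: *\nDisallow: /admin/";
        Robots robots = Robots.Load( content );

        Console.WriteLine( robots.IsPathAllowed( "MyBot", "/admin/settings" ) ); // False
        Console.WriteLine( robots.IsPathAllowed( "MyBot", "/blog/post-1" ) );    // True
    }
}
```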

Load() public static method

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.
public static Load ( string content ) : Robots
content string Content of the robots.txt file.
Returns Robots
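
A sketch of a typical call site: fetch a site's robots.txt over HTTP and hand the content to Load. The URL and user-agent string are illustrative; any string holding robots.txt content works.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Rock.UniversalSearch.Crawler.RobotsTxt;

class LoadExample
{
    static async Task Main()
    {
        // Fetch a live robots.txt; example.com is a placeholder host.
        using ( var client = new HttpClient() )
        {
            string content = await client.GetStringAsync( "https://example.com/robots.txt" );
            Robots robots = Robots.Load( content );
            Console.WriteLine( robots.IsPathAllowed( "MyBot", "/" ) );
        }
    }
}
```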

Robots() public method

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.
public Robots ( string content ) : System
content string Content of the robots.txt file.
Returns System
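
Since the constructor and the static Load factory are documented identically, either entry point should produce an equivalent instance; a short sketch with illustrative content:

```csharp
using System;
using Rock.UniversalSearch.Crawler.RobotsTxt;

class ConstructorExample
{
    static void Main()
    {
        string content = "User-agent: *\nDisallow: /private/";

        // Both entry points parse the same content.
        Robots viaConstructor = new Robots( content );
        Robots viaFactory = Robots.Load( content );

        Console.WriteLine( viaConstructor.IsPathAllowed( "MyBot", "/private/x" ) ); // False
        Console.WriteLine( viaFactory.IsPathAllowed( "MyBot", "/private/x" ) );     // False
    }
}
```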