C# Class Rock.UniversalSearch.Crawler.RobotsTxt.Robots

Provides functionality for parsing a robots.txt file's content and querying the rules and directives inside it.
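A minimal end-to-end sketch of how the class can be used, based on the signatures documented below. The robots.txt content and the "RockWebCrawler" user-agent string are placeholder values for illustration, not part of the Rock API.

using System.Threading;
using Rock.UniversalSearch.Crawler.RobotsTxt;

// Parse a robots.txt file's content (placeholder content shown here).
var robots = Robots.Load( "User-agent: *\r\nDisallow: /admin/\r\nCrawl-delay: 10" );

// Query the parsed rules for a specific user agent.
bool allowed = robots.IsPathAllowed( "RockWebCrawler", "/page/about-us" );
long delayMs = robots.CrawlDelay( "RockWebCrawler" );

if ( allowed )
{
    if ( delayMs > 0 )
    {
        Thread.Sleep( (int)delayMs ); // honor the crawl delay between requests
    }

    // ... fetch /page/about-us ...
}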

Public Methods

Method  Description
CrawlDelay ( string userAgent ) : long

Gets the number of milliseconds to wait between successive requests for this robot.

IsPathAllowed ( string userAgent, string path ) : bool

Checks if the given user-agent can access the given relative path.

Load ( string content ) : Robots

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.

Robots ( string content ) : System

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.

Private Methods

Method  Description
isPathMatch ( string path, string rulePath ) : bool
normalizePath ( string path ) : string
readLines ( string lines ) : void

Method Details

CrawlDelay() public method

Gets the number of milliseconds to wait between successive requests for this robot.
Throws an exception when the userAgent parameter is null, empty, or consists only of white-space characters.
public CrawlDelay ( string userAgent ) : long
userAgent string User agent string.
Return long
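A short sketch of honoring the returned delay between requests. The sample robots.txt content and user-agent string are illustrative assumptions; the numeric result depends on how the parser interprets the Crawl-delay directive, but per the description above it is expressed in milliseconds.

using System.Threading;
using Rock.UniversalSearch.Crawler.RobotsTxt;

var robots = Robots.Load( "User-agent: *\r\nCrawl-delay: 10" );

// Delay, in milliseconds, that this robot should wait between successive requests.
long delayMs = robots.CrawlDelay( "RockWebCrawler" );

if ( delayMs > 0 )
{
    Thread.Sleep( (int)delayMs ); // pause before issuing the next request
}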

IsPathAllowed() public method

Checks if the given user-agent can access the given relative path.
Throws an exception when the userAgent parameter is null, empty, or consists only of white-space characters.
public IsPathAllowed ( string userAgent, string path ) : bool
userAgent string User agent string.
path string Relative path.
Return bool
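A sketch of guarding a crawl request with this check. The robots.txt content, user-agent string, and relative path are placeholders.

using Rock.UniversalSearch.Crawler.RobotsTxt;

var robots = Robots.Load( "User-agent: *\r\nDisallow: /admin/" );

if ( robots.IsPathAllowed( "RockWebCrawler", "/admin/settings" ) )
{
    // safe to fetch this path for this user agent
}
else
{
    // skip it; the parsed rules disallow this path for this user agent
}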

Load() public static method

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.
public static Load ( string content ) : Robots
content string Content of the robots.txt file.
Return Robots
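A sketch of feeding Load() with the text of a live robots.txt file. The URL is a placeholder, and WebClient is just one way to fetch the content; Load() only needs the resulting string.

using System.Net;
using Rock.UniversalSearch.Crawler.RobotsTxt;

string content;
using ( var client = new WebClient() )
{
    // Download the site's robots.txt as plain text (placeholder URL).
    content = client.DownloadString( "https://www.example.org/robots.txt" );
}

Robots robots = Robots.Load( content );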

Robots() public method

Initializes a new RobotsTxt.Robots instance for the given robots.txt file content.
public Robots ( string content ) : System
content string Content of the robots.txt file.
Return System
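The constructor appears to mirror the static Load() method; a one-line sketch, assuming both accept the same robots.txt content string (placeholder content shown).

using Rock.UniversalSearch.Crawler.RobotsTxt;

var robots = new Robots( "User-agent: *\r\nDisallow: /admin/" );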