Information for RPM perl-WWW-RobotRules-6.02-1.ru.src.rpm

ID: 26485
Name: perl-WWW-RobotRules
Version: 6.02
Release: 1.ru
Epoch: (none)
Arch: src
Summary: WWW-RobotRules - database of robots.txt-derived permissions
Description: This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts. The following methods are provided (see the usage sketch after this list):

    $rules = WWW::RobotRules->new($robot_name)
        This is the constructor for WWW::RobotRules objects. The first
        argument given to new() is the name of the robot.

    $rules->parse($robot_txt_url, $content, $fresh_until)
        The parse() method takes as arguments the URL that was used to
        retrieve the /robots.txt file, and the contents of the file.

    $rules->allowed($uri)
        Returns TRUE if this robot is allowed to retrieve this URL.

    $rules->agent([$name])
        Get/set the agent name. NOTE: Changing the agent name will clear
        the robots.txt rules and expire times out of the cache.
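A minimal usage sketch built from the methods listed above. The robot name, the example host, and the use of LWP::Simple to fetch the file are illustrative assumptions, not part of this package's metadata:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Identify the robot; parsed rules are matched against this agent name.
    my $rules = WWW::RobotRules->new('ExampleBot/1.0');

    # Fetch and parse a site's /robots.txt (placeholder host).
    my $robots_url = 'http://www.example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Check whether this robot may retrieve a given URL on that host.
    my $target = 'http://www.example.com/private/page.html';
    print $rules->allowed($target)
        ? "Allowed: $target\n"
        : "Blocked by robots.txt: $target\n";

The same $rules object can be fed /robots.txt files from additional hosts through further parse() calls, as the description notes.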
Build Time: 2015-04-06 16:54:53 GMT
Size: 14730 bytes
Payload Hash: 315ff7f962f3b5bdc52220eea56c711b
Buildroot: centos5-rutgers-staging-build-4596-6905
Provides: No Provides
Requires:
    rpmlib(CompressedFileNames) <= 3.0.4-1
Obsoletes: No Obsoletes
Conflicts: No Conflicts
Files (2):
    Name                          Size (bytes)
    WWW-RobotRules-6.02.tar.gz    9059
    perl-WWW-RobotRules.spec      4666
Component of: No Buildroots