Blocking Crawlers on the Server

2022-09-26 17:05 By "Powerless"

【Blocking crawler access in Nginx】

if ($http_user_agent ~* "Scrapy|Baiduspider|Curl|HttpClient|Bytespider|FeedDemon|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|YisouSpider|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|^$") {
    return 403;
}
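The behavior of the condition above can be sketched in Python: nginx's ~* operator is a case-insensitive regex match against the User-Agent header, and the ^$ alternative catches requests that send an empty User-Agent. This is a hypothetical reproduction using a shortened subset of the pattern, not nginx itself:

```python
import re

# Case-insensitive match, like nginx's "~*". Subset of the full list above;
# "^$" matches an empty User-Agent string.
BLOCKED_UA = re.compile(r"Scrapy|Baiduspider|Curl|HttpClient|Python-urllib|^$",
                        re.IGNORECASE)

def is_blocked(user_agent: str) -> bool:
    """Return True when the nginx `if` condition would fire (-> 403)."""
    return BLOCKED_UA.search(user_agent) is not None

print(is_blocked("Scrapy/2.6 (+https://scrapy.org)"))  # True
print(is_blocked("baiduspider"))                       # True (case-insensitive)
print(is_blocked(""))                                  # True ("^$" matches empty UA)
print(is_blocked("Mozilla/5.0 (Windows NT 10.0)"))     # False
```

Note the pattern is matched anywhere in the header (a substring search), so "Java" would also block user agents that merely contain that word.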

To redirect these requests to another page instead of blocking them, just replace return 403 with the corresponding address. Configuration:

if ($http_user_agent ~* "Scrapy|Baiduspider|Curl|HttpClient|Bytespider|FeedDemon|JikeSpider|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|YisouSpider|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms|^$") {
    return 301 https://yoursite.com;
}

To block visitors arriving from specific referrers, configure as follows:

if ($http_referer ~ "baidu\.com|google\.net|bing\.com") {
    return 403;
}
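One detail worth noticing: this rule uses nginx's ~ operator, which is case-sensitive, unlike the ~* used for the user-agent check. A hypothetical Python mirror of the condition makes the difference visible:

```python
import re

# Case-sensitive match, like nginx's "~" (no re.IGNORECASE here).
# The escaped dot means "baiduXcom" would not match.
BLOCKED_REFERER = re.compile(r"baidu\.com|google\.net|bing\.com")

print(bool(BLOCKED_REFERER.search("https://www.baidu.com/s?wd=x")))  # True
print(bool(BLOCKED_REFERER.search("https://www.Baidu.com/")))        # False (case differs)
print(bool(BLOCKED_REFERER.search("https://duckduckgo.com/")))       # False
```

If mixed-case referrers should also be blocked, use ~* in the nginx rule instead.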

To allow only GET, HEAD and POST requests, configure as follows:

# forbid access for methods other than GET, HEAD and POST
if ($request_method !~ ^(GET|HEAD|POST)$) {
    return 403;
}
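The !~ test negates an anchored, case-sensitive match, so anything that is not exactly one of the three listed methods gets a 403. A hypothetical Python sketch of the same decision:

```python
import re

# Anchored, case-sensitive whitelist, mirroring the nginx "!~" test.
# HTTP methods are case-sensitive, so lowercase "get" is also rejected.
ALLOWED_METHOD = re.compile(r"^(GET|HEAD|POST)$")

def status_for(method: str) -> int:
    """Return the status the nginx rule would produce for this method."""
    return 200 if ALLOWED_METHOD.match(method) else 403

print(status_for("GET"))     # 200
print(status_for("DELETE"))  # 403
print(status_for("get"))     # 403
```

The ^ and $ anchors matter: without them, a method like "GETX" would slip through the whitelist.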


【Blocking crawlers in Apache】

With the mod_rewrite module confirmed to be enabled, add the following to the .htaccess file or the corresponding .conf file:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "(^$|FeedDemon|Indy Library|Alexa Toolbar|AskTbFXTV|AhrefsBot|CrawlDaddy|CoolpadWebkit|Java|Feedly|UniversalFeedParser|ApacheBench|Microsoft URL Control|Swiftbot|ZmEu|oBot|jaunty|Python-urllib|lightDeckReports Bot|YYSpider|DigExt|HttpClient|MJ12bot|heritrix|EasouSpider|Ezooms)" [NC]
RewriteRule . - [R=403,L]

Note the pattern must be quoted because several entries (Indy Library, Microsoft URL Control, lightDeckReports Bot) contain spaces; unquoted, Apache would parse them as separate directive arguments. The [NC] flag makes the match case-insensitive, equivalent to nginx's ~* operator.

