Dynamically add to allowed_domains in a Scrapy spider

走了就别回头了 · 2021-01-20 12:35

I have a spider that starts with a small list of allowed_domains at the beginning of the crawl. I need to add more domains to this whitelist dynamically as the crawl proceeds, based on links discovered on the pages being scraped.
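
For reference, a minimal sketch of the static setup this starts from (the class name, domain, and URL below are placeholders, not the actual spider):

    from scrapy import Spider

    class WebsiteListSpider(Spider):
        name = "website_list"
        # Static whitelist known when the spider starts (placeholder domain).
        allowed_domains = ["somedomain.com"]
        start_urls = ["http://www.somedomain.com/list-of-websites"]

        def parse(self, response):
            # Requests to hosts outside allowed_domains are dropped by OffsiteMiddleware.
            pass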

2 Answers
  •  一个人的身影 · 2021-01-20 13:04

    You could try something like the following:

    from urllib.parse import urlparse

    from bs4 import BeautifulSoup
    from scrapy import Request
    from scrapy.spiders import Spider as BaseSpider  # "BaseSpider" is the pre-1.0 name for scrapy.Spider


    class APSpider(BaseSpider):
        name = "APSpider"

        start_urls = [
            "http://www.somedomain.com/list-of-websites",
        ]

        def __init__(self, *args, **kwargs):
            super(APSpider, self).__init__(*args, **kwargs)
            # Start with an empty whitelist; it is filled in on the first parse.
            self.allowed_domains = []

        def parse(self, response):
            soup = BeautifulSoup(response.body, "html.parser")

            if not self.allowed_domains:
                for link_tag in soup.find_all('td', {'class': 'half-width'}):
                    _website = link_tag.find('a')['href']
                    u = urlparse(_website)
                    self.allowed_domains.append(u.netloc)

                    yield Request(url=_website, callback=self.parse_secondary_site)

            # Compare the host of the current response against the whitelist,
            # not the full URL.
            if urlparse(response.url).netloc in self.allowed_domains:
                yield Request(...)

    ...
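
    One caveat: Scrapy's OffsiteMiddleware compiles its allowed-domain filter once, when the spider opens, so domains appended to allowed_domains later are not necessarily picked up by the offsite filter on their own. Below is a minimal sketch of one workaround, a helper method (hypothetical name) you could add to the spider class. It assumes the middleware rebuilds its host regex in its spider_opened handler, as recent Scrapy versions do; re-sending the signal also re-runs any other spider_opened handlers, so treat it as a workaround rather than a supported API:

    from scrapy import signals

    def add_allowed_domain(self, domain):
        """Append a domain and ask the offsite filter to rebuild itself (hypothetical helper)."""
        if domain not in self.allowed_domains:
            self.allowed_domains.append(domain)
            # Re-fire spider_opened so OffsiteMiddleware recompiles its host regex.
            self.crawler.signals.send_catch_log(
                signal=signals.spider_opened, spider=self)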
    
