calling controller(crawler4j-3.5) inside loop

Submitted by 会有一股神秘感 on 2019-12-19 10:44:12

Question


Hi, I am calling the controller inside a for-loop because I have more than 100 URLs. I keep them all in a list, iterate over it, and crawl each page. I also pass each URL to setCustomData, because the crawler should not leave that domain.

for (Iterator<String> iterator = ifList.listIterator(); iterator.hasNext();) {
    String str = iterator.next();
    System.out.println("checking " + str);
    CrawlController controller = new CrawlController(config, pageFetcher,
        robotstxtServer);
    controller.setCustomData(str);
    controller.addSeed(str);
    controller.startNonBlocking(BasicCrawler.class, numberOfCrawlers);
    controller.waitUntilFinish();
}

But when I run the above code, the first URL is crawled perfectly; after that, the second URL gets started and the run shuts down with the log output below.

50982 [main] INFO edu.uci.ics.crawler4j.crawler.CrawlController  - Crawler 1 started.
51982 [Crawler 1] DEBUG org.apache.http.impl.conn.PoolingClientConnectionManager  - Connection request: [route: {}->http://www.connectzone.in][total kept alive: 0; route allocated: 0 of 100; total allocated: 0 of 100]
60985 [Thread-2] INFO edu.uci.ics.crawler4j.crawler.CrawlController  - It looks like no thread is working, waiting for 10 seconds to make sure...
70986 [Thread-2] INFO edu.uci.ics.crawler4j.crawler.CrawlController  - No thread is working and no more URLs are in queue waiting for another 10 seconds to make sure...
80986 [Thread-2] INFO edu.uci.ics.crawler4j.crawler.CrawlController  - All of the crawlers are stopped. Finishing the process...
80987 [Thread-2] INFO edu.uci.ics.crawler4j.crawler.CrawlController  - Waiting for 10 seconds before final clean up...
91050 [Thread-2] DEBUG org.apache.http.impl.conn.PoolingClientConnectionManager  - Connection manager is shutting down
91051 [Thread-2] DEBUG org.apache.http.impl.conn.PoolingClientConnectionManager  - Connection manager shut down

Please help me solve this problem. I am iterating to start and run the controller inside the loop because I have a lot of URLs in a list.

NOTE: I am using crawler4j-3.5.jar and its dependencies.


Answer 1:


Try:

for(String url : urls) {
    controller.addSeed(url);
}
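
The point is to create the controller once, add every URL in your list as a seed of that one controller, and then start a single crawl instead of one crawl per URL. A minimal sketch of that setup, reusing the config, pageFetcher, robotstxtServer, and numberOfCrawlers objects from your question:

// One controller for the whole run: every URL becomes a seed of the same crawl.
CrawlController controller = new CrawlController(config, pageFetcher, robotstxtServer);
for (String url : urls) {
    controller.addSeed(url);
}
// Blocking start; startNonBlocking(...) plus waitUntilFinish() also works.
controller.start(BasicCrawler.class, numberOfCrawlers);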

Also override shouldVisit(WebURL) so that the crawler cannot leave your domains.
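
In crawler4j 3.5 this hook is shouldVisit(WebURL) on your WebCrawler subclass. A minimal sketch, assuming BasicCrawler extends WebCrawler and using a hypothetical ALLOWED_PREFIXES allow-list (the connectzone.in entry is only taken from your log; replace it with the domains of your own seed URLs):

import edu.uci.ics.crawler4j.crawler.Page;
import edu.uci.ics.crawler4j.crawler.WebCrawler;
import edu.uci.ics.crawler4j.url.WebURL;

public class BasicCrawler extends WebCrawler {

    // Hypothetical allow-list; fill it with the domains of your seed URLs.
    private static final String[] ALLOWED_PREFIXES = {
        "http://www.connectzone.in/"
    };

    @Override
    public boolean shouldVisit(WebURL url) {
        String href = url.getURL().toLowerCase();
        for (String prefix : ALLOWED_PREFIXES) {
            if (href.startsWith(prefix)) {
                return true;  // stay inside the allowed domains
            }
        }
        return false;         // skip everything outside them
    }

    @Override
    public void visit(Page page) {
        System.out.println("Visited: " + page.getWebURL().getURL());
    }
}

With the domain check inside the crawler itself, you no longer need setCustomData to carry the domain around, and you avoid the per-URL controller restarts that produce the repeated shutdown logs in your output.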



Source: https://stackoverflow.com/questions/30323522/calling-controllercrawler4j-3-5-inside-loop
