Python selenium screen capture not getting whole page

Submitted by 风格不统一 on 2019-12-18 03:45:38

Question


I am trying to create a generic web crawler that will go to a site and take a screenshot. I am using Python, Selenium, and PhantomJS. The problem is that the screenshot does not capture all the images on a page. For example, if I go to YouTube, it doesn't capture the images below the main page image. (I don't have high enough rep to post a screenshot.) I think this may have something to do with dynamic content, but I have tried wait functions such as the implicitly_wait and set_page_load_timeout methods. Because this is a generic crawler, I can't wait for a specific event (I want to crawl hundreds of sites).

Is it possible to create a generic web crawler that can do the screen capture I am trying to do? The code I am using is:

from selenium import webdriver

phantom = webdriver.PhantomJS()
phantom.set_page_load_timeout(30)      # give up if the page takes more than 30s to load
phantom.get(response.url)
img = phantom.get_screenshot_as_png()  # binary PNG data
phantom.quit()

Here is the image (the linked screenshot was not preserved in this copy).
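
For reference, here is a minimal sketch of how the two wait settings mentioned above are typically applied (the URL is illustrative). Neither one helps here, because lazily loaded images are only fetched once they scroll into view:

from selenium import webdriver

phantom = webdriver.PhantomJS()
phantom.implicitly_wait(10)            # wait up to 10s when locating elements
phantom.set_page_load_timeout(30)      # raise a timeout if get() exceeds 30s
phantom.get('http://youtube.com')      # illustrative URL

# The capture still contains only content that has actually rendered.
img = phantom.get_screenshot_as_png()
phantom.quit()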


Answer 1:


Your suggestion solved the problem: scrolling through the page before capturing forces the lazily loaded images to render. I used the following code (adapted in part from an answer to another question):

from selenium import webdriver

driver = webdriver.PhantomJS()
driver.maximize_window()
driver.get('http://youtube.com')

# Scroll through a long series of positions so that lazily loaded
# images are brought into view and rendered before the capture.
scheight = .1
while scheight < 9.9:
    driver.execute_script("window.scrollTo(0, document.body.scrollHeight/%s);" % scheight)
    scheight += .01
driver.save_screenshot('screenshot.png')
driver.quit()
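
A gentler variant of the same idea scrolls one viewport at a time and pauses briefly at each step so deferred content has time to load. This is a sketch, not a tuned implementation; the window size, scroll step, and sleep interval are assumptions:

import time
from selenium import webdriver

driver = webdriver.PhantomJS()
driver.set_window_size(1366, 768)      # assumed viewport size
driver.get('http://youtube.com')

# Total page height as reported by the DOM.
total_height = driver.execute_script("return document.body.scrollHeight")

# Scroll down one viewport at a time so lazy images can render.
for offset in range(0, total_height, 768):
    driver.execute_script("window.scrollTo(0, arguments[0]);", offset)
    time.sleep(0.2)                    # illustrative per-step wait

driver.execute_script("window.scrollTo(0, 0);")  # return to the top
driver.save_screenshot('screenshot.png')
driver.quit()

Compared with the loop above, this bounds the per-step wait explicitly instead of relying on the sheer number of scroll calls to give images time to load.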


Source: https://stackoverflow.com/questions/26211056/python-selenium-screen-capture-not-getting-whole-page
