WebDriverWait

Scraping a React-table using Selenium

Posted by 心已入冬 on 2021-01-28 09:01:03
Question: I have written code for scraping an HTML React table using Python and Selenium, but it cannot capture the values in the table (only the DOM elements). Here is the website: https://nonfungible.com/market/history/decentraland?filter=saleType%3D&length=10&sort=blockTimestamp%3Ddesc&start=0 Here is my code: from selenium import webdriver dr = webdriver.PhantomJS(r'PATH_TO_PHANTOM/phantomjs-2.1.1-macosx/bin/phantomjs') dr.get("https://nonfungible.com/market/history/decentraland?filter=saleType%3D&length=10
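React tables are rendered client-side, so the cell values only exist after JavaScript runs; PhantomJS is deprecated and no longer supported by recent Selenium releases, so the usual fix is a maintained headless browser plus an explicit wait before reading the page. A minimal sketch of the idea (the Selenium lines are commented out because they need a live driver; the `table_cells` helper name and the sample markup are illustrative, not from the question):

```python
from html.parser import HTMLParser

class TableTextParser(HTMLParser):
    """Collect the text of every <td> cell from rendered HTML."""
    def __init__(self):
        super().__init__()
        self.cells = []
        self._in_td = False

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False

    def handle_data(self, data):
        if self._in_td and data.strip():
            self.cells.append(data.strip())

def table_cells(rendered_html):
    parser = TableTextParser()
    parser.feed(rendered_html)
    return parser.cells

# Assumed Selenium usage (requires e.g. headless Chrome and chromedriver):
# from selenium import webdriver
# from selenium.webdriver.common.by import By
# from selenium.webdriver.support.ui import WebDriverWait
# from selenium.webdriver.support import expected_conditions as EC
# driver = webdriver.Chrome()
# driver.get(url)
# WebDriverWait(driver, 20).until(
#     EC.presence_of_element_located((By.CSS_SELECTOR, "table tbody tr")))
# print(table_cells(driver.page_source))

sample = "<table><tr><td>LAND</td><td>4.2 ETH</td></tr></table>"
print(table_cells(sample))  # ['LAND', '4.2 ETH']
```

The key point is waiting for a row to be present before reading `page_source`; without the wait, only the empty React mount point is in the DOM.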

How to get total number of nested frames inside the frame

Posted by 僤鯓⒐⒋嵵緔 on 2021-01-28 08:41:03
Question: Snapshot of frame-top with 3 nested frames. Code: WebElement topframe = driver.findElement(By.xpath("//frame[@name='frame-top']")); String frame1 = driver.switchTo().frame(topframe).switchTo().frame("frame-left").findElement(By.xpath("//body")).getText(); System.out.println(frame1); List<WebElement> nestedFrames = driver.switchTo().frame(topframe).findElements(By.tagName("frame")); System.out.println(nestedFrames.size()); On top you can see this page has nested frames inside the frame(frame
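In the snippet above, the second `switchTo().frame(topframe)` is issued after the context has already moved into frame-left, so the `topframe` reference no longer matches the current browsing context; the usual pattern is to return to the default content, switch into the parent frame once, and only then count the `frame` elements there. A stdlib sketch of the counting step (Selenium lines commented because they need a live driver; `count_frames` is an illustrative helper):

```python
from html.parser import HTMLParser

class FrameCounter(HTMLParser):
    """Count <frame> elements in a page's HTML source."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "frame":
            self.count += 1

def count_frames(html_source):
    parser = FrameCounter()
    parser.feed(html_source)
    return parser.count

# Assumed Selenium equivalent:
# driver.switch_to.default_content()
# driver.switch_to.frame(driver.find_element(By.XPATH,
#                        "//frame[@name='frame-top']"))
# nested = driver.find_elements(By.TAG_NAME, "frame")
# print(len(nested))   # number of frames nested inside frame-top

print(count_frames("<frameset><frame src='a'><frame src='b'><frame src='c'></frameset>"))
```

Switching back to `default_content()` between frame operations is what keeps each `switch_to.frame(...)` call starting from a known context.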

Click on check box after confirming text in html table which is dynamic

Posted by ﹥>﹥吖頭↗ on 2021-01-28 08:38:22
Question: I need to click on the checkbox in the HTML table after asserting its text. Below is the HTML. <div class="k-widget k-grid "> <div class="k-grid-header" style="padding: 0px 16px 0px 0px;"> <div class="k-grid-header-wrap"> <table> <colgroup> <col width="50px"> <col> <col> </colgroup> <thead> <tr> <th aria-sort="" colspan="1" rowspan="1" class="k-header checkbox-grid-column"><input id="c3c07f7e-5119-4a36-9f67-98fa4d21fa07" type="checkbox" class="k-checkbox"><label class="k-checkbox-label" for=
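For a dynamic grid like this (the markup is a Kendo-style grid), one common approach is an XPath that anchors on the row's cell text and then descends to that row's checkbox, so the locator survives row reordering. A minimal sketch; the helper name and the sample cell text are illustrative, and the real grid markup may need the XPath adjusted:

```python
def row_checkbox_xpath(cell_text):
    """Build an XPath locating the checkbox inside the <tr> whose <td>
    text equals cell_text (whitespace-normalized)."""
    return ("//tr[td[normalize-space()='%s']]"
            "//input[@type='checkbox']" % cell_text)

# Assumed Selenium usage (needs a live driver):
# from selenium.webdriver.common.by import By
# from selenium.webdriver.support.ui import WebDriverWait
# from selenium.webdriver.support import expected_conditions as EC
# WebDriverWait(driver, 10).until(
#     EC.element_to_be_clickable((By.XPATH, row_checkbox_xpath("Invoice 42")))
# ).click()

print(row_checkbox_xpath("Invoice 42"))
```

Because the grid is dynamic, wrapping the click in an explicit `element_to_be_clickable` wait matters as much as the locator itself.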

How to send text to the recovery mail field of https://mail.protonmail.com/create/new?language=en using Selenium and Python

Posted by 别来无恙 on 2021-01-28 08:02:25
Question: So I'm trying to get my first ProtonMail account generator working. My problem is that Selenium won't find either the field for the recovery mail or the Create Account button. I have already switched to the iframe. I'm pretty new and thought that this problem might be caused by the "new" HTML document which contains the bottom part (starting with the recovery email). Hope someone can help me. Screenshot from selenium import webdriver import time url = 'https://mail.protonmail.com/create/new?language
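When a signup page splits its form across several iframes, each field is only reachable after switching into the iframe that actually contains it, and the driver must switch back to the default content before entering a different one. A sketch of picking the right frame by its `src` (the helper and the sample URLs are illustrative; the Selenium lines are commented because they need a live driver):

```python
def frame_index_with(srcs, keyword):
    """Return the index of the first frame whose src contains keyword,
    or -1 if none match. Illustrative helper, not a Selenium API."""
    for i, src in enumerate(srcs):
        if keyword in src:
            return i
    return -1

# Assumed Selenium usage:
# from selenium.webdriver.common.by import By
# frames = driver.find_elements(By.TAG_NAME, "iframe")
# srcs = [f.get_attribute("src") or "" for f in frames]
# driver.switch_to.default_content()
# driver.switch_to.frame(frames[frame_index_with(srcs, "bottom")])
# ...fill the recovery email field here...
# driver.switch_to.default_content()   # before touching the other frame

print(frame_index_with(["https://a/top.html", "https://a/bottom.html"], "bottom"))
```

`EC.frame_to_be_available_and_switch_to_it` from `selenium.webdriver.support.expected_conditions` combines the wait and the switch when the frame loads late.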

I need help finding an element with a locator

Posted by 不想你离开。 on 2021-01-28 07:43:12
Question: I'm trying to get the elements from this XPath. Here is the code; there are several of these elements, so I cut the code down to avoid a million lines on here. html <keyword-text class="_nghost-fyp-81"><div class="keyword-text _ngcontent- fyp-81" clickabletooltiptarget="" aria-label=""><span class="keyword _ngcontent-fyp-81" aria-hidden="false">new york new york las vegas</span> <!----></div><!----><!----></keyword-text> xpath keyword_text = self.browser.find_elements_by_xpath("//span[starts-with(
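Angular-generated suffixes like `_ngcontent-fyp-81` change between builds, so matching only the stable prefix of the class with `starts-with()` is the robust choice here. A small builder for that pattern (the helper name is illustrative):

```python
def starts_with_xpath(tag, attr, prefix):
    """Build an XPath matching elements whose attribute value starts with
    a stable prefix, ignoring volatile generated suffixes."""
    return "//%s[starts-with(@%s, '%s')]" % (tag, attr, prefix)

# Assumed Selenium usage (needs a live driver):
# spans = self.browser.find_elements_by_xpath(
#     starts_with_xpath("span", "class", "keyword"))
# texts = [s.text for s in spans]   # e.g. 'new york new york las vegas', ...

print(starts_with_xpath("span", "class", "keyword"))
```

Note that `find_elements_by_xpath` returns a list even for a single match, so the `.text` of each element has to be read in a loop.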

Trying to select an option from the dropdown in Selenium web automation - error: "ElementNotInteractableException: could not be scrolled into view"

Posted by 做~自己de王妃 on 2021-01-28 07:37:30
Question: package com.web.automation; import java.util.concurrent.TimeUnit; import org.openqa.selenium.By; import org.openqa.selenium.JavascriptExecutor; import org.openqa.selenium.WebDriver; import org.openqa.selenium.WebElement; import org.openqa.selenium.firefox.FirefoxDriver; import org.openqa.selenium.support.ui.ExpectedConditions; import org.openqa.selenium.support.ui.Select; import org.openqa.selenium.support.ui.WebDriverWait; import org.testng.annotations.AfterMethod; import org.testng
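A frequent cause of "could not be scrolled into view" with `Select` is that the dropdown is not a native `<select>` at all but a styled div/ul widget, which the `Select` wrapper cannot drive; the other common cause is an element that genuinely needs scrolling before interaction. A sketch of the first check, in Python (the guard function is illustrative; the Selenium lines are commented because they need a live driver):

```python
def assert_native_select(tag_name):
    """Selenium's Select wrapper only works on a native <select> element;
    custom div/ul dropdowns must be driven with plain clicks instead."""
    if tag_name.lower() != "select":
        raise ValueError(
            "Select requires a <select> element, got <%s>" % tag_name)
    return True

# Assumed Selenium usage:
# from selenium.webdriver.support.ui import Select
# el = driver.find_element(By.ID, "country")
# assert_native_select(el.tag_name)
# driver.execute_script("arguments[0].scrollIntoView(true);", el)
# Select(el).select_by_visible_text("Germany")

print(assert_native_select("SELECT"))  # True
```

If the check raises, the fix is to click the widget open and then click the desired option element, typically behind an `element_to_be_clickable` wait.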

Selenium not able to click on Get Data button on using Python

Posted by 时光怂恿深爱的人放手 on 2021-01-28 06:22:37
Question: I am scraping data from this website. The element is below (using geckodriver): <img class="getdata-button" style="float:right;" src="/common/images/btn-get-data.gif" id="get" onclick="document.getElementById('submitMe').click()"> but I can't get Selenium to click it. I even tried XPath and the id, but no luck. Is there any fix or workaround to get it done? Answer 1: To click on the element Get Data you can use either of the following locator strategies: Using css_selector: driver.find_element_by_css_selector(
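The image's own `onclick` handler reveals the real submit control: it just forwards the click to the element with id `submitMe`. When clicking the image fails, firing that target directly via JavaScript is a workaround worth trying. A sketch that pulls the target id out of such an inline handler (the helper name is illustrative; the id `submitMe` comes from the snippet above):

```python
import re

def onclick_target_id(onclick):
    """Extract the element id referenced by an inline
    document.getElementById('...').click() handler, or None."""
    m = re.search(r"getElementById\('([^']+)'\)", onclick)
    return m.group(1) if m else None

# Assumed Selenium usage (needs a live driver):
# img = driver.find_element(By.ID, "get")
# target = onclick_target_id(img.get_attribute("onclick"))
# driver.execute_script(
#     "document.getElementById(arguments[0]).click();", target)

print(onclick_target_id("document.getElementById('submitMe').click()"))  # submitMe
```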

Web scraping with selenium and python - xpath with contains text

Posted by こ雲淡風輕ζ on 2021-01-28 02:00:22
Question: I will try to make this really short. I am trying to click on a product that came out of a search on a website. Basically, there is a list of matching products, and I want to click on the first one whose title contains the product name I searched. I will post the link of the website so you can inspect its DOM structure: https://www.tonercartuccestampanti.it/#/dfclassic/query=CE285A&query_name=match_and In this case, many contain my query string, and I would simply like to click on the
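Matching "the first result whose title contains my query" maps naturally onto XPath's `contains()` plus a positional filter over the whole match set. A small builder for that locator (the helper name is illustrative; `CE285A` is the query from the question):

```python
def contains_text_xpath(tag, text):
    """XPath for the FIRST element of a given tag whose normalized text
    contains `text`. The outer parentheses make [1] apply to the whole
    result set, not per-parent."""
    return "(//%s[contains(normalize-space(.), '%s')])[1]" % (tag, text)

# Assumed Selenium usage (needs a live driver; the result items on the
# site may use a different tag than 'a'):
# from selenium.webdriver.common.by import By
# from selenium.webdriver.support.ui import WebDriverWait
# from selenium.webdriver.support import expected_conditions as EC
# WebDriverWait(driver, 10).until(EC.element_to_be_clickable(
#     (By.XPATH, contains_text_xpath("a", "CE285A")))).click()

print(contains_text_xpath("a", "CE285A"))
```

Since the result list is injected by a search widget, waiting for the element to be clickable is needed before the click.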

Unable to access the remaining elements by xpaths in a loop after accessing the first element- Webscraping Selenium Python

Posted by 本小妞迷上赌 on 2021-01-14 23:42:08
Question: I'm trying to scrape data from the ScienceDirect website. I'm trying to automate the scraping process by accessing the journal issues one after another, creating a list of XPaths and looping over them. When I run the loop, I'm unable to access the rest of the elements after accessing the first journal. This process worked for me on another website but not on this one. I also wanted to know whether there is any better way to access these elements apart from this process. #Importing libraries import requests
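Elements located before a navigation become stale once the page changes, so a loop that collects WebElements up front typically fails after the first click. Re-locating each item fresh on every iteration, via an indexed XPath, avoids holding stale references. A sketch (the helper name and the sample XPath are illustrative, not taken from the question's code):

```python
def indexed_xpaths(base_xpath, count):
    """Generate one XPath per item so each element can be re-located
    fresh on every loop iteration, instead of reusing WebElements that
    go stale after page navigation."""
    return ["(%s)[%d]" % (base_xpath, i) for i in range(1, count + 1)]

# Assumed Selenium usage (needs a live driver):
# from selenium.webdriver.common.by import By
# n = len(driver.find_elements(By.XPATH, "//a[@class='anchor']"))
# for xp in indexed_xpaths("//a[@class='anchor']", n):
#     driver.find_element(By.XPATH, xp).click()   # fresh lookup each time
#     ...scrape the issue page...
#     driver.back()                               # old references now stale

print(indexed_xpaths("//a[@class='anchor']", 2))
```

An alternative for this kind of site is collecting the `href` attributes first and calling `driver.get(href)` for each, which sidesteps staleness entirely.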
