I am creating a script that crawls a website to gather some data, but the site blocked me after too many requests. Using a proxy, I can send more requests.
Here is a quick, creative solution that doesn't require modifying Selenium's Options or uploading a file to ChromeDriver. It uses pyautogui (any Python package that simulates key presses would do) to enter the proxy auth details, and threading to handle Chrome's authentication popup, which would otherwise pause the script.
import time
from threading import Thread

import pyautogui
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

hostname = "HOST_NAME"
port = "PORT"
proxy_username = "USERNAME"
proxy_password = "PASSWORD"

# Point Chrome at the proxy; the credentials are entered later via the popup.
chrome_options = Options()
chrome_options.add_argument('--proxy-server={}:{}'.format(hostname, port))
driver = webdriver.Chrome(options=chrome_options)

def enter_proxy_auth(proxy_username, proxy_password):
    # Give Chrome's authentication popup a moment to appear and take focus,
    # then type the credentials into it.
    time.sleep(1)
    pyautogui.typewrite(proxy_username)
    pyautogui.press('tab')
    pyautogui.typewrite(proxy_password)
    pyautogui.press('enter')

def open_a_page(driver, url):
    # driver.get() blocks until the page loads, so it runs in its own thread
    # while the popup is filled in from the other one.
    driver.get(url)

Thread(target=open_a_page, args=(driver, "http://www.example.com/")).start()
Thread(target=enter_proxy_auth, args=(proxy_username, proxy_password)).start()
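If later steps depend on the page having loaded, you can keep references to the threads and join them before continuing. A small usage sketch with the same names as above:

page_thread = Thread(target=open_a_page, args=(driver, "http://www.example.com/"))
auth_thread = Thread(target=enter_proxy_auth, args=(proxy_username, proxy_password))
page_thread.start()
auth_thread.start()
# Block until the popup has been handled and the page has finished loading.
page_thread.join()
auth_thread.join()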
NOTE: For any serious project or test suite, I would recommend opting for a more robust solution. However, if you are just experimenting and need something quick and effective, this is an option.
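If you do need something sturdier, one option is the selenium-wire package, which supports authenticated proxies directly and avoids the popup entirely. A minimal sketch, assuming selenium-wire is installed (pip install selenium-wire) and reusing the placeholder credentials from above:

from seleniumwire import webdriver  # drop-in replacement for selenium's webdriver

proxy_url = "http://USERNAME:PASSWORD@HOST_NAME:PORT"
seleniumwire_options = {
    'proxy': {
        'http': proxy_url,
        'https': proxy_url,
        'no_proxy': 'localhost,127.0.0.1',  # skip the proxy for local addresses
    }
}

driver = webdriver.Chrome(seleniumwire_options=seleniumwire_options)
driver.get("http://www.example.com/")  # proxy auth is handled transparently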