Today I wrote a short script for IP detection:
from urllib import request, error
from bs4 import BeautifulSoup
import re
if __name__ == '__main__':
    url = 'http://ip.webmasterhome.cn/'
    # Steps for using a proxy
    # 1. Set the proxy IP: pick an IP:PORT from a proxy-listing site
    proxy = {'http': '120.210.219.73:8080'}
    # 2. Create a ProxyHandler
    proxy_handler = request.ProxyHandler(proxy)
    # 3. Create an Opener
    opener = request.build_opener(proxy_handler)
    # 4. Install the Opener
    request.install_opener(opener)
    # From here on, any request to url goes through the proxy server
    try:
        req = request.Request(url)
        req.add_header('User-Agent', 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.121 Safari/537.36')
        rsp = request.urlopen(req)
        html = rsp.read()
        soup = BeautifulSoup(html, "html.parser")
        # print(html)
        for each in soup.find_all(class_=re.compile("df_d ipres")):
            # print(each.text, "->", ''.join(["https://baike.baidu.com", each["href"]]))
            print(each.text)
    except error.HTTPError as e:  # HTTPError is a subclass of URLError, so catch it first
        print(e)
    except error.URLError as e:
        print(e)
    except Exception as e:
        print(e)
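Once the opener is installed globally, a quick way to verify that the proxy is actually in effect is to hit a service that echoes back the caller's IP. The sketch below assumes http://httpbin.org/ip is reachable and that the proxy above is still alive; if the proxy works, the printed address should be the proxy's, not your own.

from urllib import request, error
import json

proxy_handler = request.ProxyHandler({'http': '120.210.219.73:8080'})
request.install_opener(request.build_opener(proxy_handler))

try:
    # httpbin returns the origin IP of the request as JSON, e.g. {"origin": "1.2.3.4"};
    # with a working proxy this prints the proxy's address rather than yours.
    with request.urlopen('http://httpbin.org/ip', timeout=10) as rsp:
        print(json.load(rsp)['origin'])
except (error.URLError, OSError) as e:
    print('proxy check failed:', e)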
This detection approach does its filtering with BeautifulSoup's find_all method, using a regular expression to match the class of the elements that hold the results.
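As a self-contained illustration of that filtering step (the HTML snippet below is invented for the example and is not the real markup of ip.webmasterhome.cn), find_all with a compiled pattern looks like this:

from bs4 import BeautifulSoup
import re

# Made-up markup that mimics elements carrying class="df_d ipres"
html = '''
<div class="df_d ipres">1.2.3.4</div>
<div class="df_d ipres">5.6.7.8</div>
<div class="other">not an IP result</div>
'''

soup = BeautifulSoup(html, "html.parser")
# class_ accepts a compiled regex; BeautifulSoup tries it against each CSS class
# and against the space-joined class string, so "df_d ipres" matches the first two divs
for tag in soup.find_all(class_=re.compile("df_d ipres")):
    print(tag.text)  # 1.2.3.4, then 5.6.7.8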
Also, when hunting for proxy IPs online, I'd suggest sticking to the more reputable sites, the ones with at least some anti-scraping measures in place; otherwise you'll end up like I did before, without a single usable IP...
Source: https://blog.csdn.net/qq_43574052/article/details/99474846