Python lxml/beautiful soup to find all links on a web page

Submitted by 独自空忆成欢 on 2020-01-01 09:34:29

Question


I am writing a script to read a web page and build a database of links that match certain criteria. Right now I am stuck with lxml, trying to understand how to grab all the <a href>s from the HTML...

result = self._openurl(self.mainurl)
content = result.read()
html = lxml.html.fromstring(content)
print lxml.html.find_rel_links(html,'href')

Answer 1:


Use XPath. Something like (can't test from here):

urls = html.xpath('//a/@href')
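As a rough sketch of that XPath call, using a small in-memory page in place of a fetched one (the markup below is made up for illustration):

```python
from lxml import html as lxml_html

# Hypothetical page content standing in for the result of a real fetch.
doc = lxml_html.fromstring(
    '<html><body>'
    '<a href="/page1">One</a>'
    '<a href="https://example.com/page2">Two</a>'
    '<a>no href here</a>'
    '</body></html>'
)

# //a/@href selects the href attribute of every <a> element;
# anchors without an href are simply skipped.
urls = doc.xpath('//a/@href')
print(urls)  # ['/page1', 'https://example.com/page2']
```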



Answer 2:


With iterlinks, lxml provides an excellent function for this task.

This yields (element, attribute, link, pos) for every link [...] in an action, archive, background, cite, classid, codebase, data, href, longdesc, profile, src, usemap, dynsrc, or lowsrc attribute.
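A minimal sketch of iterlinks in action, again on a made-up in-memory document; note that it picks up src attributes as well as href:

```python
from lxml import html as lxml_html

# Hypothetical page with one <a href> and one <img src>.
doc = lxml_html.fromstring(
    '<html><body>'
    '<a href="/about">About</a>'
    '<img src="/logo.png">'
    '</body></html>'
)

# iterlinks() yields (element, attribute, link, pos) tuples in document
# order for every link-carrying attribute (href, src, action, ...).
links = [(el.tag, attr, link) for el, attr, link, pos in doc.iterlinks()]
print(links)  # [('a', 'href', '/about'), ('img', 'src', '/logo.png')]
```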




Answer 3:


I want to offer an alternative lxml-based solution. It uses the CSSSelector class provided by lxml.cssselect:

    import urllib.request
    import lxml.html
    from lxml.cssselect import CSSSelector

    connection = urllib.request.urlopen('http://www.yourTargetURL/')
    dom = lxml.html.fromstring(connection.read())
    selAnchor = CSSSelector('a')   # matches every <a> element
    foundElements = selAnchor(dom)
    print([e.get('href') for e in foundElements])



Answer 4:


You can use this method:

from urllib.parse import urljoin, urlparse
from lxml import html as lh

class Crawler:
    def __init__(self, start_url):
        self.start_url = start_url
        self.base_url = f'{urlparse(self.start_url).scheme}://{urlparse(self.start_url).netloc}'
        self.visited_urls = set()

    def fetch_urls(self, html):
        urls = []
        dom = lh.fromstring(html)
        for href in dom.xpath('//a/@href'):
            url = urljoin(self.base_url, href)
            if url not in self.visited_urls and url.startswith(self.base_url):
                urls.append(url)
        return urls
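As a quick sanity check, the class can be exercised against an in-memory HTML string; the class is repeated here so the example runs standalone, and the URLs are placeholders:

```python
from urllib.parse import urljoin, urlparse
from lxml import html as lh

# Repeated from the answer above so this example is self-contained.
class Crawler:
    def __init__(self, start_url):
        self.start_url = start_url
        self.base_url = f'{urlparse(self.start_url).scheme}://{urlparse(self.start_url).netloc}'
        self.visited_urls = set()

    def fetch_urls(self, html):
        urls = []
        dom = lh.fromstring(html)
        for href in dom.xpath('//a/@href'):
            url = urljoin(self.base_url, href)
            if url not in self.visited_urls and url.startswith(self.base_url):
                urls.append(url)
        return urls

# Hypothetical page: a relative link, an absolute internal link,
# and an external link that should be filtered out.
crawler = Crawler('https://example.com/index.html')
page = (
    '<html><body>'
    '<a href="/a">A</a>'
    '<a href="https://example.com/b">B</a>'
    '<a href="https://other.site/c">C</a>'
    '</body></html>'
)
result = crawler.fetch_urls(page)
print(result)  # ['https://example.com/a', 'https://example.com/b']
```

Relative hrefs are resolved against the site root via urljoin, and the startswith check keeps the crawl on the starting domain.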


Source: https://stackoverflow.com/questions/6131089/python-lxml-beautiful-soup-to-find-all-links-on-a-web-page
