Scraping table data from multiple links and combining it into one Excel file

Submitted by 妖精的绣舞 on 2020-06-28 03:46:12

Question


I have a link, and within that link there are several products. Each product page has a table of specifications, laid out so that the first column holds the category names (the headers) and the second column holds the corresponding values. The set of categories differs from product to product, with some overlap. I want to build one big table whose columns are all of these categories and whose rows are the different products. I am able to get the data for one table (one product) as follows:

import csv
import requests
from bs4 import BeautifulSoup

url = "https://www.1800cpap.com/resmed-airfit-n30-nasal-cpap-mask-with-headgear"
soup = BeautifulSoup(requests.get(url).text, 'html.parser')

# first column of the spec table -> header row
headers = [td.text for td in soup.select_one('.table').select('td:nth-of-type(1)')]

with open("data.csv", "w", encoding="utf-8-sig", newline='') as csv_file:
    w = csv.writer(csv_file, delimiter=",", quoting=csv.QUOTE_MINIMAL)
    w.writerow(headers)
    # second column -> one data row per table
    for table in soup.select('table'):
        w.writerow([td.text for td in table.select('td:nth-of-type(2)')])

I understand that for different products I will have to loop over the link to each product, and I am able to do that. However, how do I append each table to the previous output so that the required table structure is maintained?
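To illustrate the structure I want, here is a toy example with two made-up products whose spec categories only partly overlap (the dicts and values below are invented for illustration only):

import pandas as pd

# hypothetical per-product spec dicts; keys are the first-column labels
rows = [
    {'Item Title': 'Mask A', 'Mask Type': 'Nasal', 'Material': 'Silicone'},
    {'Item Title': 'Mask B', 'Mask Type': 'Nasal', 'Headgear Included': 'Yes'},
]

df = pd.DataFrame(rows)   # categories missing for a product show up as NaN
print(df)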


Answer 1:


import requests
import pandas as pd
from bs4 import BeautifulSoup


url = 'https://www.1800cpap.com/cpap-masks/nasal'

def get_item(url):
    """Scrape one product page and return its specifications as a dict."""
    soup = BeautifulSoup(requests.get(url).content, 'html.parser')

    print('Getting {}..'.format(url))

    title = soup.select_one('h1.product-details-full-content-header-title').get_text(strip=True)

    # key = first-column label (spec name), value = second-column cell
    all_data = {'Item Title': title}
    for tr in soup.select('#product-specs-list tr'):
        h, v = [td.get_text(strip=True) for td in tr.select('td')]
        all_data[h.rstrip(':')] = v

    return all_data

all_data = []
for page in range(1, 2):  # increase the upper bound to scrape more listing pages
    print('Page {}...'.format(page))
    soup = BeautifulSoup(requests.get(url, params={'page': page}).content, 'html.parser')

    # follow every product link on the listing page
    for a in soup.select('a.facets-item-cell-grid-title'):
        u = 'https://www.1800cpap.com' + a['href']
        all_data.append(get_item(u))

# one row per product; specs a product lacks become empty (NaN) cells
df = pd.DataFrame(all_data)
df.to_csv('data.csv')

Prints:

Page 1...
Getting https://www.1800cpap.com/resmed-airfit-n30-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/dreamwear-nasal-cpap-mask-with-headgear-by-philips-respironics..
Getting https://www.1800cpap.com/eson-2-nasal-cpap-mask-with-headgear-by-fisher-and-paykel..
Getting https://www.1800cpap.com/resmed-mirage-fx-nasal-cpap-mask..
Getting https://www.1800cpap.com/airfit-n30i-nasal-cpap-mask-by-resmed..
Getting https://www.1800cpap.com/dreamwisp-nasal-cpap-mask-fitpack..
Getting https://www.1800cpap.com/respironics-comfortgel-blue-cpap-nasal-mask-with-headgear..
Getting https://www.1800cpap.com/resmed-mirage-fx-for-her-nasal-cpap-mask..
Getting https://www.1800cpap.com/airfit-n20-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/wisp-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/pico-nasal-cpap-mask-with-headgear-by-philips-respironics-2..
Getting https://www.1800cpap.com/airfit-n20-for-her-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/airfit-f10-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/fisher-and-paykel-zest-q-nasal-mask-with-headgear..
Getting https://www.1800cpap.com/resmed-swift-fx-nano-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/resmed-ultra-mirage-2-nasal-cpap-mask..
Getting https://www.1800cpap.com/airfit-n10-for-her-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/eson-nasal-cpap-mask-by-fisher-and-paykel..
Getting https://www.1800cpap.com/resmed-swift-fx-nano-nasal-cpap-mask-for-her-with-headgear..
Getting https://www.1800cpap.com/mirage-activa-lt-cpap-mask-by-resmed..
Getting https://www.1800cpap.com/resmed-mirage-micro-cpap-mask..
Getting https://www.1800cpap.com/phillips-respironics-trueblue-nasal-cpap-mask-with-headgear..
Getting https://www.1800cpap.com/fisher-paykel-zest-cpap-mask..
Getting https://www.1800cpap.com/viva-nasal-cpap-mask-by-3b-medical..

And saves data.csv (screenshot from LibreOffice):


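Since the question asks for a single Excel file, the same DataFrame can also be written with pandas' to_excel instead of (or in addition to) to_csv. This is a minimal sketch, assuming an Excel writer engine such as openpyxl or xlsxwriter is installed; the file name is arbitrary:

# write the combined table to an Excel workbook
df.to_excel('data.xlsx', index=False)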

Source: https://stackoverflow.com/questions/62588205/scraping-table-data-from-multiple-links-and-combine-this-together-in-one-excel-f
