Split scrapy's large CSV file

Submitted by 孤街浪徒 on 2020-08-23 12:16:40

Question


Is it possible to make Scrapy write to CSV files with no more than 5000 rows in each one? How can I give the files a custom naming scheme? Am I supposed to modify CsvItemExporter?


Answer 1:


Try this pipeline:

# -*- coding: utf-8 -*-

# Define your item pipelines here
#
# Don't forget to add your pipeline to the ITEM_PIPELINES setting
# See: http://doc.scrapy.org/en/latest/topics/item-pipeline.html

from scrapy.exporters import CsvItemExporter

import datetime

class MyPipeline(object):

    def __init__(self, stats):
        self.stats = stats
        self.base_filename = "result/amazon_{}.csv"
        self.next_split = self.split_limit = 50000 # assuming you want to split 50000 items/csv
        self.create_exporter()  

    @classmethod
    def from_crawler(cls, crawler):
        return cls(crawler.stats)

    def create_exporter(self):
        now = datetime.datetime.now()
        datetime_stamp = now.strftime("%Y%m%d%H%M")
        self.file = open(self.base_filename.format(datetime_stamp),'w+b')
        self.exporter = CsvItemExporter(self.file)
        self.exporter.start_exporting()       

    def process_item(self, item, spider):
        # Use get_value() with a default: 'item_scraped_count' is not set
        # in the stats until the first item has been scraped
        if self.stats.get_value('item_scraped_count', 0) >= self.next_split:
            self.next_split += self.split_limit
            self.exporter.finish_exporting()
            self.file.close()
            self.create_exporter()
        self.exporter.export_item(item)
        return item

    def close_spider(self, spider):
        # flush and close the last file when the spider finishes
        self.exporter.finish_exporting()
        self.file.close()

Don't forget to add the pipeline to your settings:

ITEM_PIPELINES = {
   'myproject.pipelines.MyPipeline': 300,   
}
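The datetime stamp above is one naming scheme, but the rollover logic itself does not depend on Scrapy. Here is a minimal standard-library sketch of the same idea with sequentially numbered files instead (the `SplitCsvWriter` name and the `part-N` naming pattern are illustrative, not part of the original answer):

```python
import csv


class SplitCsvWriter:
    """Write dict rows to numbered CSV files, starting a new file
    (with its own header row) every `limit` rows."""

    def __init__(self, base_path, fieldnames, limit=5000):
        self.base_path = base_path    # e.g. "result/amazon_part{}.csv"
        self.fieldnames = fieldnames
        self.limit = limit
        self.part = 0
        self.count = 0
        self.file = None
        self._open_next()

    def _open_next(self):
        # close the current file (if any) and open the next numbered one
        if self.file:
            self.file.close()
        self.part += 1
        self.file = open(self.base_path.format(self.part), "w", newline="")
        self.writer = csv.DictWriter(self.file, fieldnames=self.fieldnames)
        self.writer.writeheader()
        self.count = 0

    def write_row(self, row):
        if self.count >= self.limit:
            self._open_next()
        self.writer.writerow(row)
        self.count += 1

    def close(self):
        if self.file:
            self.file.close()
```

With `limit=2` and five rows this produces `part1.csv`, `part2.csv`, and `part3.csv`, with two, two, and one data rows respectively. Inside a Scrapy pipeline you would call `write_row(dict(item))` from `process_item` and `close()` from `close_spider`.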



Answer 2:


Are you using Linux?

The split command is very useful for this case.

split -l 5000  -d --additional-suffix .csv items.csv items-

See split --help for the options.
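If `split` is not available (e.g. on Windows), or if each chunk should repeat the CSV header row (which `split` does not do), a short portable version with Python's standard `csv` module achieves the same result. This is a sketch; the `split_csv` name and the `items-{}.csv` output pattern are illustrative:

```python
import csv


def split_csv(src_path, out_pattern, limit=5000):
    """Split src_path into files of at most `limit` data rows each,
    repeating the header row at the top of every chunk."""
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        part, out, writer = 0, None, None
        count = limit  # force a new chunk on the first data row
        for row in reader:
            if count >= limit:
                if out:
                    out.close()
                part += 1
                out = open(out_pattern.format(part), "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)  # keep the header in every chunk
                count = 0
            writer.writerow(row)
            count += 1
        if out:
            out.close()
```

For example, `split_csv("items.csv", "items-{}.csv", limit=5000)` writes `items-1.csv`, `items-2.csv`, and so on, mirroring the `split` invocation above but keeping each chunk a valid standalone CSV file.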



Source: https://stackoverflow.com/questions/21009027/split-scrapys-large-csv-file
