How can I improve CSV reader & writer script performance?

Submitted by 空扰寡人 on 2020-07-10 08:46:12

Question


My script reads image links from an input CSV and performs quality checks on each of those images, then appends the data from the checks to a new CSV. Obviously, running several checks on each line is the limiting factor, but I am trying to find a way to speed up the processing. I thought about reading in multiple lines at a time, but would that help? What are some suggested ways to speed this script up?

# reader and writer come from the standard csv module
from csv import reader, writer

# runs each check as needed
def add_to_row(row, line_num):

    # my processing checks would go here

    # appending score and check results
    for result in results:
        row.append(result)

# deals with csv file
def add_column_in_csv(input_file, output_file, transform_row):
    # open input file and create output file
    with open(input_file, 'r', newline='') as read_obj, \
        open(output_file, 'w', newline='') as write_obj:
        # create a csv.reader object from the input file object
        csv_reader = reader(read_obj)
        # create a csv.writer object from the output file object
        csv_writer = writer(write_obj)
        # read each row of the input csv file as list
        for row in csv_reader:
            # append the headers and values from checks in add_to_row
            transform_row(row, csv_reader.line_num)
            # add the updated row to the output file
            csv_writer.writerow(row)

# let them know it is doing something
print('Analyzing input file . . .')
# actually add the data to the new csv here
add_column_in_csv(input_file, output_file, add_to_row)
# it is done
print('Done analyzing file. Output created: ' + str(output_file))

Source: https://stackoverflow.com/questions/61898863/how-can-improve-csv-reader-writer-script-performance
