pyodbc

How to speed up bulk insert to MS SQL Server from CSV using pyodbc

∥☆過路亽.° · submitted on 2019-11-26 03:10:38
Question: Below is my code that I'd like some help with. I have to run it over 1,300,000 rows, which means it takes up to 40 minutes to insert ~300,000 rows. I figure a bulk insert is the route to go to speed it up? Or is it because I'm iterating over the rows via the "for data in reader:" portion?

    #Opens the prepped csv file
    with open(os.path.join(newpath, outfile), 'r') as f:
        #hooks csv reader to file
        reader = csv.reader(f)
        #pulls out the columns (which match the SQL table)
        columns = next(reader)
        #trims
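A common fix for this kind of slowdown (a minimal sketch, not the asker's actual code): pyodbc 4.0.19+ exposes a fast_executemany flag on the cursor, which sends each batch of parameters to SQL Server in one round trip instead of issuing one INSERT per row. The helper below assumes an already-open pyodbc connection and a table whose columns match the CSV header; the names batches and bulk_insert_csv and the batch size are illustrative choices, not part of the original post.

```python
import csv
from itertools import islice

def batches(iterable, size):
    """Yield successive lists of up to `size` items from any iterable."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def bulk_insert_csv(conn, csv_path, table, batch_size=10_000):
    """Insert a CSV (header row = column names) into `table` in batches.

    `conn` is an open pyodbc connection. Table and column names are taken
    from the file, so only use this with trusted input.
    """
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        columns = next(reader)  # header row matches the SQL table
        placeholders = ", ".join("?" * len(columns))
        sql = (f"INSERT INTO {table} ({', '.join(columns)}) "
               f"VALUES ({placeholders})")
        cursor = conn.cursor()
        cursor.fast_executemany = True  # batch parameters in one round trip
        for chunk in batches(reader, batch_size):
            cursor.executemany(sql, chunk)
        conn.commit()
```

Even without fast_executemany, replacing a per-row execute() inside the "for data in reader:" loop with executemany() over chunks usually helps; for the largest loads, SQL Server's BULK INSERT statement against a file the server can read is faster still.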

Working with an Access database in Python on non-Windows platform (Linux or Mac)

独自空忆成欢 · submitted on 2019-11-26 00:56:08
Question: I want to access the data in a Microsoft Access database. I have some .accdb and .mdb files and want to read them in Python. From my research, pyodbc can only be used on the Windows platform, but I am working on Mac OS X. I am new to Python. The other option would be to export the data from the database to a CSV and then use that in Python. Any help or starting point would be highly appreciated.

Answer 1: "From my research, pyodbc can only be used on Windows platform" Not true. The main pyodbc page says
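For the non-Windows case, one commonly used route (a sketch under assumptions, not a verified recipe for every system) is the MDBTools ODBC driver on Linux, which pyodbc can load through a DSN-less connection string. Note that MDBTools is read-only and historically handles .mdb files better than .accdb; the exact driver name registered in odbcinst.ini can vary by distribution. The helper names below are hypothetical:

```python
def mdbtools_connection_string(db_path):
    """DSN-less connection string for the MDBTools ODBC driver.

    Assumes the driver is registered under the name "MDBTools"
    (e.g. via the libmdbodbc package on Debian/Ubuntu); check your
    odbcinst.ini if the name differs.
    """
    return f"DRIVER={{MDBTools}};DBQ={db_path};"

def open_access_db(db_path):
    """Open a read-only connection to an Access file via pyodbc."""
    import pyodbc  # deferred so the string helper works without pyodbc
    return pyodbc.connect(mdbtools_connection_string(db_path))
```

Alternatives on Mac/Linux include the UCanAccess JDBC driver (via jaydebeapi) or simply running mdb-export from the mdbtools command-line suite to dump each table to CSV, which matches the asker's fallback plan.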