bulk-load

How to import tables with missing values?

Submitted by 落爺英雄遲暮 on 2019-11-30 18:06:59
I am using basketball data tables to get some understanding of Postgres 9.2 and phpPgAdmin, so I would like to import CSV tables into that database. However, I get:

ERROR: missing data for column "year"
CONTEXT: COPY coaches, line 1: ""coachid";"year";"yr_order";"firstname";"lastname";"season_win";"season_loss";"playoff_win";"playoff..."

with the command:

\copy coaches FROM '/Users/Desktop/Database/NBAPostGres/DataOriginal/coaches_data.csv' DELIMITER ',' CSV;

The current table has no missing values. So my questions are: what did I do wrong here, and how would I import a table that does contain missing values?
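The error text itself points at a likely cause: line 1 of the file is a semicolon-separated header, while the command declares a comma delimiter, so the whole header line is read as a single value for coachid and nothing is left over for year. A plausible fix, assuming the data rows are semicolon-separated like the header, is to declare the real delimiter and skip the header row:

\copy coaches FROM '/Users/Desktop/Database/NBAPostGres/DataOriginal/coaches_data.csv' DELIMITER ';' CSV HEADER;

In CSV mode, unquoted empty fields are imported as NULL by default, which also covers tables that do contain missing values.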

How to convert date strings to timestamp without knowing the date format

Submitted by 限于喜欢 on 2019-11-30 12:08:13
I am trying to write a query that inserts a value into a field of type timestamp without time zone. The value comes from a CSV file uploaded by the client, and I am working with PostgreSQL 8.1.21. The file has a date column whose values sometimes arrive as '28-Sep-13' and sometimes as '28/09/2013'. I tried to cast the string into a timestamp with str_date::timestamp. This works fine when str_date is something like '28-Sep-13', but it fails when the incoming date has the format '28/09/2013', with this error: ERROR: date/time field value out of range: "28/09/2013"
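Under the default DateStyle (ISO, MDY), Postgres reads '28/09/2013' as month 28, hence the error. A minimal sketch of two ways around it, assuming only the two formats quoted above ever occur (staging_table and str_date stand for wherever the raw text lands):

-- Option 1: tell the session to read ambiguous dates day-first.
SET datestyle = 'ISO, DMY';
SELECT '28-Sep-13'::timestamp;    -- 2013-09-28 00:00:00
SELECT '28/09/2013'::timestamp;   -- 2013-09-28 00:00:00

-- Option 2: branch on the separator and name each format explicitly.
SELECT CASE WHEN str_date LIKE '%/%'
            THEN to_timestamp(str_date, 'DD/MM/YYYY')
            ELSE to_timestamp(str_date, 'DD-Mon-YY')
       END::timestamp AS parsed
FROM staging_table;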

App Engine Bulk Loader Performance

Submitted by 我们两清 on 2019-11-30 09:20:09
Question: I am using the App Engine bulk loader (Python runtime) to bulk upload entities to the datastore. The data I am uploading is stored in a proprietary format, so I have implemented my own connector (registered in bulkload_config.py) to convert it to the intermediate Python dictionary:

from google.appengine.ext.bulkload import connector_interface

class MyCustomConnector(connector_interface.ConnectorInterface):
    ....
    # Overridden method
    def generate_import_record(self, filename, bulkload_state):
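The excerpt cuts off inside generate_import_record. As context, a minimal sketch of how such a method can be completed; the pipe-delimited parsing, the dict keys, and the bulkload_state default are placeholders for the asker's proprietary format, and the real interface has further hooks omitted here:

from google.appengine.ext.bulkload import connector_interface

class MyCustomConnector(connector_interface.ConnectorInterface):
    # Stand-in parsing: one pipe-delimited record per line.
    def generate_import_record(self, filename, bulkload_state=None):
        with open(filename) as f:
            for line in f:
                fields = line.rstrip('\n').split('|')
                # Yield one dict per entity; keys must match the
                # property map declared in bulkload_config.py.
                yield {'id': fields[0], 'name': fields[1]}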

INSERT of 10 million queries under 10 minutes in Oracle?

Submitted by 别来无恙 on 2019-11-30 07:00:43
I am working on a file loader program whose purpose is to take an input file, apply some conversions to its data, and then upload the data into an Oracle database. The problem I am facing is that I need to optimize the insertion of very large input data sets into Oracle. I am uploading data into a table, let's say ABC. I am using the OCI library provided by Oracle in my C++ program; specifically, I am using an OCI connection pool for multi-threaded loading into Oracle (http://docs.oracle.com/cd/B28359_01/appdev.111/b28395/oci09adv.htm). The following are the DDL statements
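The excerpt is cut off before the DDL, but the standard OCI lever for this kind of load is array (batch) binding: bind one C array per column and execute with iters set to the batch size, so each network round trip inserts thousands of rows instead of one. A condensed sketch under those assumptions; table ABC's columns, the credentials, and all sizes are invented, and every status return should be checked in real code:

#include <oci.h>
#include <cstdio>
#include <cstring>

int main() {
    const ub4 BATCH = 10000;  // rows sent per round trip (tunable)
    OCIEnv *env; OCIError *err; OCISvcCtx *svc; OCIStmt *stmt;
    OCIBind *b1 = 0, *b2 = 0;

    // Handle setup; error checking omitted for brevity.
    OCIEnvCreate(&env, OCI_THREADED, 0, 0, 0, 0, 0, 0);
    OCIHandleAlloc(env, (void**)&err, OCI_HTYPE_ERROR, 0, 0);
    OCILogon2(env, err, &svc, (const OraText*)"scott", 5,
              (const OraText*)"tiger", 5, (const OraText*)"orcl", 4,
              OCI_DEFAULT);

    const char *sql = "INSERT INTO ABC (id, payload) VALUES (:1, :2)";
    OCIHandleAlloc(env, (void**)&stmt, OCI_HTYPE_STMT, 0, 0);
    OCIStmtPrepare(stmt, err, (const OraText*)sql, (ub4)strlen(sql),
                   OCI_NTV_SYNTAX, OCI_DEFAULT);

    // One contiguous array per column; element i belongs to row i.
    static int  ids[BATCH];
    static char payloads[BATCH][31];
    for (ub4 i = 0; i < BATCH; ++i) {
        ids[i] = (int)i;
        snprintf(payloads[i], sizeof payloads[i], "row-%u", i);
    }

    // value_sz is the stride OCI uses to walk each array.
    OCIBindByPos(stmt, &b1, err, 1, ids, sizeof ids[0], SQLT_INT,
                 0, 0, 0, 0, 0, OCI_DEFAULT);
    OCIBindByPos(stmt, &b2, err, 2, payloads, sizeof payloads[0], SQLT_STR,
                 0, 0, 0, 0, 0, OCI_DEFAULT);

    // iters = BATCH: a single execute inserts all BATCH rows.
    OCIStmtExecute(svc, stmt, err, BATCH, 0, 0, 0, OCI_DEFAULT);
    OCITransCommit(svc, err, OCI_DEFAULT);

    OCILogoff(svc, err);
    return 0;
}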

In PostgreSQL, how to insert data with COPY command?

Submitted by 寵の児 on 2019-11-29 09:08:59
I have a problem running a Node.js project with a PostgreSQL database: I get an error when trying to insert data in pgAdmin using the COPY command.

COPY beer (name, tags, alcohol, brewery, id, brewery_id, image) FROM stdin;
Bons Voeux  blonde  9.5  Brasserie Dupont  250  130  generic.png

(The full data set is in a gist.) This is the error:

ERROR: syntax error at or near "Bons"
SQL state: 42601
Character: 1967

Answer: COPY tbl FROM STDIN; is not supported by pgAdmin. You get a plain syntax error because Postgres receives the data as SQL code. Four possible solutions: 1. Use a multi-row INSERT instead: INSERT INTO beer(name, tags, alcohol
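The truncated first solution, completed as a sketch using the one row shown above; the remaining rows would follow the same pattern:

INSERT INTO beer (name, tags, alcohol, brewery, id, brewery_id, image) VALUES
  ('Bons Voeux', 'blonde', 9.5, 'Brasserie Dupont', 250, 130, 'generic.png')
  -- further rows go here, one parenthesized tuple per row, comma-separated
;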

Bulk load data into sqlite?

Submitted by 江枫思渺然 on 2019-11-27 11:33:58
Does anybody have any tips on utilities that can be used to bulk load data stored in delimited text files into an SQLite database? Ideally something that can be called as a stand-alone program from a script, etc. A group I work with has an Oracle database that is going to dump a bunch of data out to file and then load that data into an SQLite database for use on a mobile device, and they are looking for the easiest way to implement that scenario.

Answer: Check out the sqlite .import command; it does exactly this. You can set the separator with the .separator command:
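A sketch of the full shell session the answer starts to show; the database, table, and file names are placeholders:

sqlite3 myDatabase
sqlite> create table myTable (col1 TEXT, col2 TEXT, col3 TEXT);
sqlite> .separator ","
sqlite> .import myData.csv myTable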

Django bulk_create with ignore rows that cause IntegrityError?

Submitted by 孤街醉人 on 2019-11-26 20:08:12
Question: I am using bulk_create to load thousands of rows into a PostgreSQL DB. Unfortunately, some of the rows cause an IntegrityError and stop the bulk_create process. Is there a way to tell Django to ignore such rows and save as much of the batch as possible?

Answer 1: This is now possible in Django 2.2, which adds a new ignore_conflicts option to the bulk_create method. From the documentation: on databases that support it (all except PostgreSQL < 9.5 and Oracle), setting the ignore_conflicts parameter to True tells the database to ignore failure to insert any rows that fail constraints such as duplicate unique values.
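A minimal sketch of the option in use; MyModel, its fields, and the row data are placeholders:

from myapp.models import MyModel  # hypothetical app and model

rows = [
    MyModel(pk=1, name='a'),
    MyModel(pk=1, name='duplicate'),  # would normally abort the batch
    MyModel(pk=2, name='b'),
]
# With ignore_conflicts=True (Django >= 2.2), the conflicting row is
# skipped instead of raising IntegrityError for the whole batch.
MyModel.objects.bulk_create(rows, ignore_conflicts=True)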
