bulk-load

Java program to pass a List of Beans to an Oracle stored procedure - pass the entire list in one shot rather than appending objects one after the other

倾然丶 夕夏残阳落幕 submitted on 2019-12-23 04:47:12

Question: I have the following set of TYPE and PROCEDURE definitions and Java code. I am able to call the stored procedure successfully, but I have to append the objects one after the other. I want this to happen in one shot, since I am dealing with over 50K records. Can anyone let me know what changes need to be made so that I can send the entire list at once? The code can be found below. TYPES: CREATE OR REPLACE TYPE CER_GL_ENTRY_TYPE AS OBJECT (idKey NUMBER(10) ); CREATE OR REPLACE TYPE CER_GL

Bulk loading into PostgreSQL from a remote client

旧城冷巷雨未停 submitted on 2019-12-23 02:57:05

Question: I need to bulk load a large file into PostgreSQL. I would normally use the COPY command, but this file needs to be loaded from a remote client machine. With MSSQL, I can install the local tools and use bcp.exe on the client to connect to the server. Is there an equivalent for PostgreSQL? If not, what is the recommended way of loading a large file from a client machine if I cannot copy the file to the server first? Thanks. Answer 1: The COPY command is supported in PostgreSQL protocol v3.0

Changing time zone value of data

烈酒焚心 submitted on 2019-12-21 08:49:15

Question: I have to import data that has no time zone information in it (however, I know the specific time zone of the data I want to import), but I need the timestamp with time zone format in the database. Once I import it and set the column's data type to timestamp with time zone, Postgres automatically assumes the data in the table is from my time zone and assigns my time zone to it. Unfortunately, the data I want to import is not from my time zone, so this does not work. The database also

Cannot bulk load because the file could not be opened. Operating System Error Code 3

故事扮演 submitted on 2019-12-20 10:18:12

Question: I'm trying to set up a stored procedure as a SQL Server Agent job, and it's giving me the following error: Cannot bulk load because the file "P:\file.csv" could not be opened. Operating system error code 3 (failed to retrieve text for this error. Reason: 15105). [SQLSTATE 42000] (Error 4861) The odd thing is that the stored procedure works just fine when I execute it manually. The P: drive is shared to the Windows SQL Server machine from Linux via Samba, and it was set up by executing the following

How to convert date strings to timestamp without knowing the date format

拜拜、爱过 submitted on 2019-12-18 14:53:38

Question: I am trying to write a query to insert a value into a timestamp without time zone field. The value comes from a CSV file. The version I am working with is PostgreSQL 8.1.21. The CSV upload is done by the client, and it has a date column. The date sometimes comes as '28-Sep-13' and sometimes as '28/09/2013'. I tried to use str_date::timestamp to cast the string into a timestamp. This works fine if str_date is something like '28-Sep-13', but it won't work if

Gather stats on an Index or drop create?

六月ゝ 毕业季﹏ submitted on 2019-12-13 14:02:53

Question: Does dropping and recreating an index have the same effect as using DBMS_STATS.gather_index_stats? (Does it have the same effect as rebuilding/updating the index?) Or are these two completely different things that should not be compared to one another? Answer 1: The difference is that gathering statistics refreshes the metadata about the current index, whereas dropping and re-creating the index is, er, dropping and re-creating the index. Perhaps it is easiest to understand the difference with a worked example

Cannot create Active X Object

一世执手 submitted on 2019-12-13 04:48:14

Question: I'm running into an error and I cannot figure out what's wrong with the code. It happens when I try to create an object (objbl = CreateObject("SQLXMLBulkLoad.SQLXMLBulkload.4.0")). Am I missing anything? Try objbl = CreateObject("SQLXMLBulkLoad.SQLXMLBulkload.4.0") ' error happens on this line objbl.ConnectionString = ReadVariables("ConnectionString") Console.WriteLine(objbl.ConnectionString.ToString) objbl.ErrorLogFile = workingdirectory & "\error.log" objbl.TempFilePath = workingdirectory

Fragmented XML Bulk Load to SQL Server in C#

不想你离开。 submitted on 2019-12-13 04:23:42

Question: I have an XML file that contains information resulting from scanning systems on different domains. The XML corresponds to tables in the database that are nested as follows: Domains > Computers > Volumes > Folders > Files. My goal is to load the XML into the corresponding tables. Since a single XML file would be too large to load into the database, I have to chunk it into smaller ones. How can I format the XMLs so the uploader knows one file is a continuation of the last file and does not generate additional keys

Should I use SqlBulkCopy or Stored Procedure to import data

一个人想着一个人 submitted on 2019-12-12 02:27:30

Question: I've got a log file which is 248 MB and can grow up to a GB, so you can imagine how many rows there can be. I need to import all the rows into a table in a SQL Server database. To do that, I first create a DataTable and add all the lines in the log file to that DataTable as new rows. This happens pretty fast; more than a million records get added in about 30 seconds. After the table is filled with the lines, I import the records in the DataTable to the database using

Large writes cause instability in Cassandra ring

Deadly submitted on 2019-12-11 16:23:34

Question: I'm attempting to load a large amount of data into a 10-node Cassandra ring. The script doing the inserts gets ~4,000 inserts/s, blocked presumably on network I/O. I launch 8 of these on a single machine, and the throughput scales almost linearly. (The individual throughput goes down slightly, but is more than compensated for by the additional processes.) This works decently; however, I'm still not getting enough throughput, so I launched the same setup on 3 more VMs. (Thus, 8 processes * 4