Teradata JDBC: What's the point of using FastLoad if Java has memory limitations?

Submitted by 若如初见 on 2020-01-05 19:36:53

Question


Here is the link to a sample JDBC FastLoad program from the Teradata website: http://developer.teradata.com/doc/connectivity/jdbc/reference/current/samp/T20205JD.java.txt

It inserts only one row, so I modified it to insert 500K rows by replacing the following code:

                        pstmt.setInt(1, 1);
                        pstmt.setString(2, strBuf);
                        pstmt.addBatch();
                        batchCount++;

with:

                        for (int i = 0; i < 500000; i++) {
                            pstmt.setInt(1, i);
                            pstmt.setString(2, strBuf);
                            pstmt.addBatch();
                            batchCount++;
                        }

It failed, of course, because Java ran out of memory.

So JDBC FastLoad fails to upload even 500K rows of very simple data, because the addBatch() method throws an OutOfMemoryError at some point.

But I read that FastLoad is able to upload millions of rows! However, I could not find any real example anywhere. How do I overcome the Java OutOfMemoryError?

Can anybody show an example with JDBC and FastLoad (NOT FastLoadCSV!) that uploads, say, 1M rows?


PS:

1) Increasing the heap with -Xmx defeats the purpose, because every additional addBatch() call executes more slowly, and extra heap has its own limits (usually around 4 GB).

2) I do not need FastLoadCSV, because it did not support text qualifiers until TTU 14 and has other issues.


Answer 1:


You must call setAutoCommit(false) and then simply call executeBatch multiple times, e.g. after every 50,000 or 100,000 addBatch calls, before you run out of memory. Finally, you commit.
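For illustration, here is a minimal sketch of that chunked-batch pattern. The host, database, credentials, and the table mytable (with columns id and txt) are placeholders, not from the original sample; TYPE=FASTLOAD in the connection URL is what selects the JDBC FastLoad protocol:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class FastLoadChunked {
        public static void main(String[] args) throws Exception {
            // Placeholder host, database, and credentials; TYPE=FASTLOAD enables FastLoad.
            String url = "jdbc:teradata://dbshost/DATABASE=mydb,TYPE=FASTLOAD";
            try (Connection con = DriverManager.getConnection(url, "user", "pass")) {
                con.setAutoCommit(false); // required so the chunks accumulate in one FastLoad

                // Note: FastLoad requires the target table to be empty before loading.
                String sql = "INSERT INTO mytable (id, txt) VALUES (?, ?)";
                try (PreparedStatement pstmt = con.prepareStatement(sql)) {
                    final int CHUNK = 50000; // flush to the database every 50K rows
                    for (int i = 0; i < 1000000; i++) {
                        pstmt.setInt(1, i);
                        pstmt.setString(2, "some data");
                        pstmt.addBatch();
                        if ((i + 1) % CHUNK == 0) {
                            pstmt.executeBatch(); // sends and frees the buffered rows
                        }
                    }
                    pstmt.executeBatch(); // flush any remaining rows
                }
                con.commit(); // ends the load; the rows become visible now
            }
        }
    }

Each executeBatch call ships the rows buffered by addBatch() to the database and releases them from the JVM heap, so memory use is bounded by the chunk size rather than by the total row count.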

See Speed up your JDBC/ODBC applications on developer.teradata.com



Source: https://stackoverflow.com/questions/26684648/teradata-jdbc-whats-the-point-of-using-fastload-if-java-has-memory-limitations
