UnableToExecuteStatementException: Batch entry was aborted. Call getNextException to see the cause

被撕碎了的回忆 2021-01-15 18:36

I am using JDBI's @SqlBatch to batch-update the DB, and the batch fails with the exception in the title:

// The SQL string and the bean type were truncated in the original post;
// the values below are hypothetical reconstructions for illustration.
@SqlBatch("INSERT INTO my_table (id, bigint_col, timestamp_col) VALUES (:id, :bigintCol, :timestampCol)")
@BatchChunkSize(INSERT_BATCH_SIZE)
void insert(@BindBean Iterator<MyBean> beans);
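
The wrapper message is generic; with the PostgreSQL driver the real cause sits on the chained SQLExceptions, exactly as the message suggests. A minimal sketch for surfacing it (assuming JDBI 3, whose exception class is org.jdbi.v3.core.statement.UnableToExecuteStatementException; MyDao and MyBean are hypothetical stand-ins for the SQL object and row type above):

    import java.sql.SQLException;
    import java.util.Iterator;
    import org.jdbi.v3.core.statement.UnableToExecuteStatementException;

    class BatchDebug {
        static void insertAll(MyDao dao, Iterator<MyBean> beans) {
            try {
                dao.insert(beans);
            } catch (UnableToExecuteStatementException e) {
                // JDBI wraps the driver's BatchUpdateException; walk the
                // SQLException chain to see why the batch entry was aborted.
                if (e.getCause() instanceof SQLException) {
                    for (SQLException next = (SQLException) e.getCause();
                         next != null; next = next.getNextException()) {
                        System.err.println(next.getMessage());
                    }
                }
                throw e;
            }
        }
    }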


        
2 Answers
  •  轮回少年
    2021-01-15 19:31

    I found a similar issue importing from Aurora (MySQL) to Redshift using a DataPipeline CopyActivity. I was able to solve it by casting the incoming data to the proper target table types in the insertQuery like this:

    INSERT INTO my_table (id, bigint_col, timestamp_col) VALUES (?, cast(? as bigint), cast(? as timestamp))
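
    Applied back to the question's JDBI method, the same cast-on-insert idea might look like this (a sketch; the table, column names, and bean type are hypothetical):

    @SqlBatch("INSERT INTO my_table (id, bigint_col, timestamp_col) "
            + "VALUES (:id, cast(:bigintCol as bigint), cast(:timestampCol as timestamp))")
    @BatchChunkSize(INSERT_BATCH_SIZE)
    void insert(@BindBean Iterator<MyBean> beans);

    With @BindBean, JDBI binds each bean property to the matching :name placeholder, so the casts run inside the database and keep the Java-side code unchanged.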
    
