Using @SqlBatch to batch update the DB:
@SqlBatch(\"\")
@BatchChunkSize(INSERT_BATCH_SIZE)
void insert(@BindBean Iterator<
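To make the effect of @BatchChunkSize concrete, here is a plain-Java sketch (no JDBI dependency) of the chunking it performs: the iterator is split into groups of INSERT_BATCH_SIZE rows, and JDBI would execute one JDBC batch per group. The INSERT_BATCH_SIZE value of 1000 is an assumption for the example.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class ChunkDemo {
    static final int INSERT_BATCH_SIZE = 1000; // assumed value for illustration

    // Splits the rows into chunks of at most `size` elements;
    // JDBI executes one JDBC batch per chunk.
    static <T> List<List<T>> chunk(Iterator<T> rows, int size) {
        List<List<T>> chunks = new ArrayList<>();
        List<T> current = new ArrayList<>(size);
        while (rows.hasNext()) {
            current.add(rows.next());
            if (current.size() == size) {
                chunks.add(current);
                current = new ArrayList<>(size);
            }
        }
        if (!current.isEmpty()) {
            chunks.add(current); // final partial chunk
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 2500; i++) rows.add(i);
        List<List<Integer>> chunks = chunk(rows.iterator(), INSERT_BATCH_SIZE);
        System.out.println(chunks.size());         // 3 chunks
        System.out.println(chunks.get(2).size());  // last chunk holds 500 rows
    }
}
```

Chunking keeps memory bounded when inserting a large stream: only one chunk's worth of bind values is held in the JDBC batch at a time.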
I hit a similar issue importing from Aurora (MySQL) to Redshift with a Data Pipeline CopyActivity. I was able to solve it by casting the incoming data to the target table's column types in the insertQuery, like this:
INSERT INTO my_table (id, bigint_col, timestamp_col) VALUES (?,cast(? as bigint),cast(? as timestamp))
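If several tables need this treatment, the cast-wrapped statement can be generated rather than hand-written. The helper below is a sketch under my own naming (buildInsert, the column-to-type map); columns mapped to null get a bare "?" placeholder, and the rest are wrapped in cast(? as <type>):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.StringJoiner;

public class CastInsertBuilder {
    // Builds an INSERT whose placeholders are cast to the target column types.
    // A null type means "no cast needed" and yields a plain "?" placeholder.
    static String buildInsert(String table, LinkedHashMap<String, String> columnTypes) {
        StringJoiner cols = new StringJoiner(", ");
        StringJoiner vals = new StringJoiner(", ");
        for (Map.Entry<String, String> e : columnTypes.entrySet()) {
            cols.add(e.getKey());
            vals.add(e.getValue() == null ? "?" : "cast(? as " + e.getValue() + ")");
        }
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + vals + ")";
    }

    public static void main(String[] args) {
        // LinkedHashMap preserves column order, so placeholders line up
        // with the bind order of the PreparedStatement.
        LinkedHashMap<String, String> types = new LinkedHashMap<>();
        types.put("id", null);
        types.put("bigint_col", "bigint");
        types.put("timestamp_col", "timestamp");
        System.out.println(buildInsert("my_table", types));
        // INSERT INTO my_table (id, bigint_col, timestamp_col)
        //   VALUES (?, cast(? as bigint), cast(? as timestamp))
    }
}
```

The explicit casts make Redshift coerce the incoming string values server-side, which is what resolves the type-mismatch errors during the copy.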