batch-processing

Error handling in Mule Salesforce Batch

Submitted by 最后都变了- on 2019-12-04 06:01:00
Question: I am trying to load a set of Accounts into Salesforce from a CSV file. I have configured the usual DataMapper, an Upsert SFDC step with Batch Commit, and a Batch Step that handles only failures (it just logs them for now). My On Complete phase has a simple Logger. I have intentionally created a CSV with bad data, and the CSV contains an external ID. My requirement is to process failed records differently based on the failure status: if a record failed due to bad data, I would like to stop processing it. If it failed …

Timeout or close when process is finished

Submitted by 柔情痞子 on 2019-12-04 05:20:17
Question: I have an X.exe program that takes about 2 to 6 hours to finish. The exact time is unknown, but I'd like to implement a threshold of 6.5 or 7 hours: if the program has not returned by then, it should be killed. How do I implement this using batch *.bat files? Here is what I have so far: a timer bat1.bat and the actual bat2.bat. bat1.bat: start cmd /C bat2.bat & timeout /t 25200 & taskkill /im X.exe /f bat2.bat: cd blah bat1.bat The problem with this approach is that only after …
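This is not a .bat answer to the question above, only a sketch of the same threshold-and-kill logic in Python, in case a small wrapper script is acceptable; the X.exe name and the 25200-second limit come from the question, everything else is assumed.

import subprocess

try:
    # Run X.exe and wait for it, but give up after 25200 s (7 hours),
    # mirroring the "timeout /t 25200" in the question.
    subprocess.run(["X.exe"], timeout=25200)
except subprocess.TimeoutExpired:
    # subprocess.run already kills the child on timeout; taskkill is only a
    # fallback in case X.exe spawned further processes with the same name.
    subprocess.run(["taskkill", "/im", "X.exe", "/f"])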

Parsing the output of wmic in shell script

Submitted by 瘦欲@ on 2019-12-04 05:10:48
Question: I am trying to parse the output of WMIC and then get the PID. My script is as follows:

@echo off
setLocal enableExtensions enableDelayedExpansion
FOR /F "tokens=1* delims=" %%A IN ('"wmic process where(name="java.exe") get ProcessID,commandline | FINDSTR /v "CommandLine" | FINDSTR "TestServer""') DO (
    set "line=%%A"
    @REM echo "%%A"
    for /F "tokens=* delims= " %%C in ("%%A") do (
        echo "%%C"
        echo "%%D"
    )
)

The output is as follows:

"java com.test.TestServer 7560 "
"%D"
"java com.test …
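For comparison only, here is the same filtering done from Python rather than a FOR /F loop; using Python at all is an assumption, not the asker's script. It relies on wmic printing the requested columns alphabetically, so CommandLine comes first and ProcessId is the last token of each data line.

import subprocess

# Ask wmic for the command line and PID of every java.exe process.
out = subprocess.run(
    ["wmic", "process", "where", "name='java.exe'",
     "get", "ProcessId,CommandLine"],
    capture_output=True, text=True,
).stdout

for line in out.splitlines():
    # Skip the header row and keep only the TestServer instance.
    if "TestServer" in line and not line.startswith("CommandLine"):
        pid = line.split()[-1]   # ProcessId is printed after CommandLine
        print(pid)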

Batch insert using groovy Sql?

Submitted by 寵の児 on 2019-12-04 04:39:50
How can you do a batch insert using groovy Sql while simulating prepared statements? All the examples I've found are similar to the following and don't use prepared statements:

withBatch { stmt ->
    stmt.addBatch("insert into table (field1,field2) values('value1','value2')")
    stmt.addBatch("insert into table (field1,field2) values('value3','value4')")
}

According to this link http://jira.codehaus.org/browse/GROOVY-3504 there is no way to use prepared statements directly from within a batch. What is the best way to simulate this, so I can avoid having to write my own code to prevent SQL injection?
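The question is about groovy.sql.Sql, but the underlying pattern, one parameterised statement executed against many rows, looks much the same in most database APIs. Below is only the analogous idea sketched in Python's DB-API (sqlite3 chosen purely so the snippet is self-contained); the values come from the example above, and the table name is changed because "table" is a reserved word.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (field1 text, field2 text)")

# One parameterised statement, many parameter tuples: the driver batches the
# inserts and no values are ever spliced into the SQL string, which is what
# removes the SQL-injection concern.
rows = [("value1", "value2"), ("value3", "value4")]
conn.executemany("insert into t (field1, field2) values (?, ?)", rows)
conn.commit()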

How can I read the last 2 lines of a file in a batch script

Submitted by 不想你离开。 on 2019-12-04 03:50:10
Question: I have a Java program that appends new build information as the last two lines of a file. How can I read them in a batch file?

Answer 1: This code segment does the trick:

for /F "delims=" %%a in (someFile.txt) do (
    set "lastButOne=!lastLine!"
    set "lastLine=%%a"
)
echo %lastButOne%
echo %lastLine%

EDIT: complete TAIL.BAT added. This method may be modified to get a larger number of lines, which may be specified by a parameter. The file below is tail.bat:

@echo off
setlocal …
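Not part of the batch answer above, just the same "keep only the tail" idea shown in Python for comparison, in case calling a small script is an option; the file name is the one from the answer.

from collections import deque

# A deque with maxlen=2 discards older lines as it reads, so only the last
# two lines of the file remain once the loop over the file object finishes.
with open("someFile.txt") as f:
    last_two = deque(f, maxlen=2)

for line in last_two:
    print(line.rstrip("\n"))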

JSR 352 with Liberty Profile - how to implement checkpointing when ItemReader does a DB query

Submitted by 会有一股神秘感。 on 2019-12-04 03:47:44
Question: I have 10 records in my source table and an item count of 3, with 2 partitions processing those 10 records (the first 5 records are processed in the first partition and the remaining records in the second). While processing records in the second partition I throw an exception, so the job fails at the second chunk of the second partition. When I restart the job, the failed partition processes all of its records again (that is, the first chunk and the second chunk). Restarting the job should only …

Batched FFTs using cufftPlanMany

Submitted by 久未见 on 2019-12-03 21:45:37
I want to perform 441 2D, 32-by-32 FFTs using the batched method provided by the cuFFT library. The parameters of the transforms are the following:

int n[2] = {32,32};
int inembed[] = {32,32};
int onembed[] = {32,32/2+1};
cufftPlanMany(&plan,2,n,inembed,1,32*32,onembed,1,32*(32/2+1),CUFFT_D2Z,441);
cufftPlanMany(&inverse_plan,2,n,onembed,1,32*32,inembed,1,32*32,CUFFT_Z2D,441);

After I did the forward and inverse FFTs using the above plans, I could not get the original data back. Can anyone advise me how to set the parameters for cufftPlanMany correctly? Many thanks in advance. By the way, is it …
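Below is not cuFFT code, only a NumPy reference for what the same batched 32x32 real-to-complex round trip should produce, useful for checking the cuFFT output; the batch count and transform sizes are taken from the question. One difference worth keeping in mind: cuFFT's transforms are unnormalised, so a D2Z followed by Z2D returns the data scaled by 32*32, whereas np.fft.irfft2 applies the 1/(32*32) factor itself.

import numpy as np

batch = np.random.rand(441, 32, 32)          # 441 independent 32x32 real inputs

freq = np.fft.rfft2(batch)                   # shape (441, 32, 17), the R2C/D2Z layout
back = np.fft.irfft2(freq, s=(32, 32))       # normalised inverse, shape (441, 32, 32)

assert np.allclose(batch, back)              # round trip recovers the original data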

Spring Batch: How to monitor currently running jobs & show progress on a JSP page

Submitted by 半城伤御伤魂 on 2019-12-03 20:41:38
I want to know how to monitor the status of my currently running batch jobs. My jobs basically process a folder with some default steps, so I want to show progress to the user step by step. I am using Tasklets and a DB job repository. An explanation with some example code for achieving this would be very helpful. Thank you. If you want to develop your own monitoring app/webpage, you may want to look into the JobExplorer or JobOperator interfaces. They provide methods to get the JobExecutions, and from a JobExecution you can get the StepExecutions. All of these give you the status of the job and of the individual …

Use slurm job id

Submitted by 我是研究僧i on 2019-12-03 18:39:53
Question: When I launch a computation on the cluster, I usually have a separate program doing the post-processing at the end:

sbatch simulation
sbatch --dependency=afterok:JOBIDHERE postprocessing

I want to avoid mistyping and have the right job id inserted automatically. Any idea? Thanks

Answer 1: You can do something like this:

RES=$(sbatch simulation) && sbatch --dependency=afterok:${RES##* } postprocessing

The RES variable will hold the result of the sbatch command, something like Submitted batch job …
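The accepted pattern above is pure shell; as a comparison only, here is the same chaining sketched from Python, which makes the "take the last word of the sbatch output" step explicit. The script names are the ones from the question; using Python at all is an assumption.

import subprocess

# sbatch prints a line like "Submitted batch job 123456"; the job id is the
# last whitespace-separated token, exactly what ${RES##* } extracts in bash.
out = subprocess.run(["sbatch", "simulation"],
                     capture_output=True, text=True, check=True).stdout
job_id = out.strip().split()[-1]

# Queue the post-processing job so it only starts once the simulation succeeds.
subprocess.run(["sbatch", f"--dependency=afterok:{job_id}", "postprocessing"],
               check=True)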

Implementing a multi-input model in Keras, with a different sample size (batch size) for each input

Submitted by 心已入冬 on 2019-12-03 16:15:50
I am currently trying to implement a multi-input model in Keras. The input consists of multiple batches, each containing a different number of samples, and I get a "different samples" error. My implementation looks like this. The model side looks as follows:

for s in range(NUM_STREAMS):
    inp.append(Input(shape=(16,8)))
...

The site where the error occurs:

history = model.train_on_batch(
    x=[x for x in X_batch],
    y=[y for y in y_batch]
)

The error I get is:

ValueError: All input arrays (x) should have the same number of samples. Got array shapes: [(6, 16, 8), (7, 16, 8), (6, 16, 8), (6, 16, 8)]

The abstract …
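A minimal runnable sketch of the constraint behind that ValueError, with made-up layer sizes beyond what the excerpt shows: train_on_batch requires every input array to carry the same number of samples, so per-stream batches of 6, 7, 6 and 6 have to be aligned (here simply trimmed to the shortest) before the call. Whether trimming is acceptable for the asker's data is an open question; the sketch only demonstrates what the check enforces.

import numpy as np
from tensorflow.keras.layers import Input, Flatten, Dense, concatenate
from tensorflow.keras.models import Model

NUM_STREAMS = 4                      # matches the four shapes in the error message

# One Input per stream, shape (16, 8) as in the question, merged into one head.
inputs = [Input(shape=(16, 8)) for _ in range(NUM_STREAMS)]
merged = concatenate([Flatten()(inp) for inp in inputs])
model = Model(inputs=inputs, outputs=Dense(1)(merged))
model.compile(optimizer="adam", loss="mse")

# Streams arrive with 6, 7, 6 and 6 samples; train_on_batch rejects that mix.
X_batch = [np.random.rand(n, 16, 8) for n in (6, 7, 6, 6)]

# Align the streams to a common sample count before the call.
n_common = min(len(x) for x in X_batch)
X_batch = [x[:n_common] for x in X_batch]
y_batch = np.random.rand(n_common, 1)

model.train_on_batch(x=X_batch, y=y_batch)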