batch-processing

Batch Limitation - Maximum Recursion while browsing menus

Submitted by 岁酱吖の on 2019-12-04 15:51:54

I have come across a big problem while testing my code: the annoying message "Maximum recursion depth exceeded". Look how it works:

```batch
@echo off
REM ---------- MAIN MENU ----------
:Main
echo 1.Supermarket
echo 2.Exit Batch
setlocal
for /f %%A in ('"prompt $H & echo on & for %%B in (1) do rem"') do set "BS=%%A"
set /p Menu=%BS% Type in your option {1,2} followed by ENTER:
if not '%Menu%'=='' set Menu=%Menu:~0,1%
if %Menu% EQU 1 ENDLOCAL & goto Super
if %Menu% EQU 2 goto EOF

REM ---------- SECONDARY MENU ----------
:Super
echo 1.Supermarket
echo a.Apple
echo b.Banana
```
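Recursion-depth errors in menu scripts typically come from re-entering the menu with nested `call`s instead of looping back with `goto`. The general fix is to drive the menu with a single loop, so returning to the main menu never adds a stack frame. A minimal, library-free Python sketch of that loop-based structure (screen names and the `inputs` iterator are illustrative stand-ins for the batch labels and `set /p`):

```python
def run_menu(inputs):
    """Drive a two-level menu with a loop instead of recursion.

    `inputs` is an iterable of user choices (a stand-in for set /p).
    Returns the list of screens shown, so the flow can be inspected.
    """
    it = iter(inputs)
    shown = []
    screen = "main"
    while True:
        shown.append(screen)
        choice = next(it, None)
        if choice is None:
            return shown
        if screen == "main":
            if choice == "1":
                screen = "super"  # descend into the secondary menu
            elif choice == "2":
                return shown      # exit cleanly, no recursion involved
        elif screen == "super":
            if choice == "back":
                screen = "main"   # going back re-enters the loop,
                                  # not a new stack frame
```

However deep the user navigates back and forth, the stack depth stays constant, which is exactly what `goto :Main` (rather than `call :Main`) achieves in batch.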

Recursively find and replace files

Submitted by 时光怂恿深爱的人放手 on 2019-12-04 14:39:55

What I want to do is the following. I want to create a bat file that recursively searches for files, starting from the current directory, and replaces them with a file that I provide. For example, if I want to search for and replace test1.txt, I open this mini app, type test1.txt, and supply the file I want the found files to be replaced with.

```
Dir
  app.bat
  test1.txt   // app will recursively search inside folder 1 and folder 2
  folder 1    // and will replace all found results with test1.txt
  folder 2
```

I wonder, if
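In batch this is usually done with a `for /r` loop plus `copy /y`; as a cross-platform sketch of the same recursive find-and-replace, here is a self-contained Python version (function name and parameters are illustrative, not from the question):

```python
import os
import shutil

def replace_all(root, filename, source):
    """Walk `root` recursively; overwrite every file named `filename`
    with a copy of `source`. Returns the paths that were replaced."""
    replaced = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if filename in filenames:
            target = os.path.join(dirpath, filename)
            # Don't overwrite the source file with itself.
            if os.path.abspath(target) != os.path.abspath(source):
                shutil.copyfile(source, target)  # overwrite in place
                replaced.append(target)
    return replaced
```

Called as `replace_all(".", "test1.txt", "test1.txt")` from the app's directory, it mirrors the layout sketched above: every `test1.txt` under `folder 1` and `folder 2` is overwritten with the top-level copy.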

Identify running instances of a batch file

Submitted by *爱你&永不变心* on 2019-12-04 13:47:51

These are not working for me. Any help to definitively correct the four examples below? EXAMPLE 01 just echoes "continue", even if I have three CMD.exe windows open.

```batch
REM ---------- EXAMPLE 01 ----------
@echo off
wmic process where name="cmd.exe" | find "cmd.exe" /c
SET ERRORLEVEL=value
if "%value%" GTR 1 (
    ECHO This batch is not first
    ECHO quitting ...
)
if "%value%" LSS 2 ECHO continue
```

I am getting the "unexpected i error" message in EXAMPLE 02!

```batch
REM ---------- EXAMPLE 02 ----------
@echo off
FOR /F
```
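Counting `cmd.exe` processes is fragile, since every open console window runs one. A different technique that answers the underlying question ("am I the first running instance?") is an exclusive lock file: atomic creation either succeeds (first instance) or fails (another instance holds it). A minimal Python sketch of that approach (the lock path and function names are illustrative):

```python
import os

def try_acquire_lock(lock_path):
    """Return a file descriptor if we are the first instance,
    or None if another instance already holds the lock file."""
    try:
        # O_EXCL makes creation atomic: open fails if the file exists.
        return os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return None

def release_lock(fd, lock_path):
    """Release the lock so the next instance may start."""
    os.close(fd)
    os.remove(lock_path)
```

One caveat of plain lock files: if the process crashes without releasing, the stale file blocks later runs until removed, so production code usually adds a staleness check (e.g. a PID written into the file).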

Is it possible to batch convert csv to xls using a macro?

Submitted by 点点圈 on 2019-12-04 13:43:59

I have a large number of csv files that I need in .xls format. Is it possible to run a batch conversion with a macro, or is this best done with another language? I have used this code http://www.ozgrid.com/forum/showthread.php?t=71409&p=369573#post369573 to reference my directory, but I'm not sure of the command to open each file and save it. Here's what I have:

```vba
Sub batchconvertcsvxls()
    Dim wb As Workbook
    Dim CSVCount As Integer
    Dim myVar As String
    myVar = FileList("C:\Documents and Settings\alistairw\My Documents\csvxlstest")
    For i = LBound(myVar) To UBound(myVar)
        With wb
            Application.Workbooks
```
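The batch pattern itself — list the directory, open each file, convert, save under a new extension — is the same in any language. Writing real .xls needs Excel automation or a spreadsheet library, so as a self-contained stand-in this Python sketch converts each .csv to a tab-separated .txt; the loop structure, not the output format, is the point (function name and extensions are illustrative):

```python
import csv
import os

def batch_convert(folder, src_ext=".csv", dst_ext=".txt"):
    """Convert every `src_ext` file in `folder`: read it as CSV and
    rewrite it tab-separated under `dst_ext`. Returns the new paths."""
    converted = []
    for name in sorted(os.listdir(folder)):
        if not name.lower().endswith(src_ext):
            continue
        src = os.path.join(folder, name)
        dst = os.path.join(folder, os.path.splitext(name)[0] + dst_ext)
        with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
            writer = csv.writer(fout, delimiter="\t")
            for row in csv.reader(fin):
                writer.writerow(row)
        converted.append(dst)
    return converted
```

In the VBA version, the body of the loop would be `Workbooks.Open` on the csv followed by `SaveAs` with an xls file format, applied per file in exactly the same way.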

HBase write: which one better on performance, batch or put(List<Put>)?

Submitted by 爱⌒轻易说出口 on 2019-12-04 12:04:58

I am starting to learn HBase for writing data streams. I use HTableInterface and am having a performance problem: inserting only 500 rows took almost 500,000 ms per batch List. Any example or suggestion for batch writes into an HTable with HTableInterface? I am using HBase 0.94. Thanks.

They're essentially the same: batch(List<? extends Row> actions, Object[] results) allows not only puts but also gets, deletes, increments...; put(List<Put> puts) just does a batch of puts (it also validates them client-side). You can also perform batches by disabling auto-flush with table.setAutoFlush(false)
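The auto-flush point generalizes beyond HBase: client-side write buffering means collecting mutations locally and sending them in one round trip once a threshold is reached, instead of one RPC per row. A library-free Python sketch of that buffering pattern (`send` stands in for the network call; class and parameter names are illustrative):

```python
class BufferedWriter:
    """Buffer writes and flush them in batches, analogous to an HBase
    client with auto-flush disabled. `send` is invoked once per
    flushed batch rather than once per row."""

    def __init__(self, send, buffer_size=3):
        self.send = send
        self.buffer_size = buffer_size
        self.buffer = []

    def put(self, row):
        self.buffer.append(row)
        if len(self.buffer) >= self.buffer_size:
            self.flush()

    def flush(self):
        if self.buffer:
            self.send(list(self.buffer))  # one round trip, many rows
            self.buffer.clear()
```

With per-row flushing, 500 inserts mean 500 round trips; buffered, they collapse into a handful, which is where the asker's 500,000 ms is usually lost.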

executeBatch behaviour in case of partial failure

Submitted by 冷暖自知 on 2019-12-04 10:10:22

I have a Java 1.6 application which uses batch inserts to insert records into an Oracle db using the JDBC driver. As you know, the Statement object has a method called executeBatch() which we use for batch updates. It has a return type of int array, which holds the result of execution for each record. But it also throws BatchUpdateException in case of error, and we can get the result int array from that too. My question is: in which error situations should I expect a BatchUpdateException, and when should I expect no exception to be thrown but a failure result for some records? Note: the question is specifically
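The exact semantics are driver-specific (Oracle's JDBC driver, for instance, reports counts differently after a failure than other drivers do), but the general shape — a batch that fails partway through, with the earlier statements already applied inside the still-open transaction — can be demonstrated with Python's sqlite3 as a stand-in for the JDBC driver (function name and schema are illustrative):

```python
import sqlite3

def run_batch(rows):
    """Insert `rows` as one batch inside a transaction. On a mid-batch
    constraint violation, report how many rows were applied before the
    failure, then roll back. Returns (applied, error, final_count)."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")
    applied, error = 0, None
    try:
        for row in rows:  # executemany would stop at the same row
            conn.execute("INSERT INTO t (id) VALUES (?)", (row,))
            applied += 1
        conn.commit()
    except sqlite3.IntegrityError as e:
        error = str(e)
        conn.rollback()   # like rolling back after BatchUpdateException
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return applied, error, count
```

The `applied` counter plays the role of `BatchUpdateException.getUpdateCounts()`: it tells you how far the batch got before the driver raised, which is the information the asker needs to decide between retrying the whole batch and resuming after the failed record.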

Doctrine EntityManager clear method in nested entities

Submitted by 百般思念 on 2019-12-04 10:06:47

I would like to use Doctrine batch insert processing in order to optimize the insertion of a large number of entities. The problem is with the clear() method: it detaches all entities that are managed by the EntityManager. So what should I do in a situation where I have a parent entity with many children, and each child has its own children, like this:

```
rideSession
  track
    points
```

So I have 1 rideSession, 3 tracks, and each track has for instance 2,000 points. I could use batch processing in the last loop, which is responsible for saving the points. But if I use the clear method, then how to set parents for
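The usual workaround is to flush and clear in chunks and to re-attach children to the parent by identifier rather than through the now-detached object (in Doctrine that would typically be a lightweight reference fetched by id). A library-free Python sketch of the chunking side of that pattern (`flush` simulates the ORM's flush+clear boundary; names are illustrative):

```python
def batch_save(parent_id, points, flush, batch_size=500):
    """Persist `points` in chunks, handing each chunk to `flush`
    together with the parent's identifier. After an ORM clear(), the
    parent object is detached, so each chunk carries the parent *id*
    instead of a stale object reference. Returns the batch count."""
    batches = 0
    for i in range(0, len(points), batch_size):
        flush(parent_id, points[i:i + batch_size])
        batches += 1
    return batches
```

Because every chunk is self-describing (parent id plus its slice of points), it no longer matters that `clear()` emptied the identity map between flushes.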

Creating TfRecords from a list of strings and feeding a Graph in tensorflow after decoding

Submitted by 与世无争的帅哥 on 2019-12-04 09:49:50

The aim is to create a database of TFRecords. Given: I have 23 folders, each containing 7500 images, and 23 text files, each with 7500 lines describing features for the 7500 images in the separate folders. I created the database through this code:

```python
import tensorflow as tf
import numpy as np
from PIL import Image

def _Float_feature(value):
    return tf.train.Feature(float_list=tf.train.FloatList(value=[value]))

def _bytes_feature(value):
    return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))
```

How to configure fault tolerance programmatically for a spring tasklet (not a chunk)

Submitted by 杀马特。学长 韩版系。学妹 on 2019-12-04 07:12:54

Programmatically configuring fault tolerance for a chunk works roughly as follows:

```java
stepBuilders.get("step")
    .<Partner,Partner>chunk(1)
    .reader(reader())
    .processor(processor())
    .writer(writer())
    .listener(logProcessListener())
    .faultTolerant()
    .skipLimit(10)
    .skip(UnknownGenderException.class)
    .listener(logSkipListener())
    .build();
```

The trick is that adding "chunk" switches the chain to a SimpleStepBuilder, which offers the "faultTolerant" method. My question is how to do that if you just have a tasklet (no reader, processor, writer)? Defining a tasklet works as follows: stepBuilders.get(
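Since the tasklet builder chain exposes no faultTolerant() step, the skip/retry policy has to live inside the tasklet's own execute logic. The general pattern — catch the skippable exceptions, count the skips, rethrow once a limit is exceeded — sketched here in Python rather than Spring's Java API (all names are illustrative):

```python
def fault_tolerant(task, items, skip_limit, skippable=(ValueError,)):
    """Run `task` over `items`, skipping up to `skip_limit` failures of
    the skippable exception types, like a hand-rolled skip policy
    inside a tasklet. Returns (results, skipped_items); raises once the
    skip limit is exceeded, which fails the step."""
    results, skipped = [], []
    for item in items:
        try:
            results.append(task(item))
        except skippable:
            skipped.append(item)
            if len(skipped) > skip_limit:
                raise  # past the limit the step fails, as in Spring Batch
    return results, skipped
```

A tasklet's execute() would wrap its unit of work the same way, returning RepeatStatus normally and letting the exception propagate only after the limit is hit.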

Limit the lifetime of a batch job

Submitted by 流过昼夜 on 2019-12-04 06:37:14

Is there a way to limit the lifetime of a running spring-batch job to e.g. 23 hours? We start a batch job daily via a cron job, and the job takes about 9 hours. Under some circumstances the DB connection was so slow that the job took over 60 hours to complete. The problem is that the next job instance gets started by the cron job the next day, and then another one the day after, and another one... If this job is not finished within e.g. 23 hours, I want to terminate it and return
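Spring Batch has no single wall-clock limit for a whole job; the usual route combines a listener or watchdog with JobOperator.stop(), and the job must cooperate by checking its stop flag. The underlying pattern — run the work in the background, wait with a deadline, signal a graceful stop on timeout — can be sketched with Python's standard library (names and the timeout value are illustrative):

```python
import concurrent.futures
import threading

def run_with_lifetime(job, timeout_s):
    """Run `job(stop_event)` in a worker thread; if it exceeds
    `timeout_s`, set the stop event so the job can terminate itself
    gracefully. Jobs must check the event periodically, much like a
    Spring Batch step checking its terminate-only flag.
    Returns (finished_in_time, result)."""
    stop = threading.Event()
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(job, stop)
        try:
            return True, future.result(timeout=timeout_s)
        except concurrent.futures.TimeoutError:
            stop.set()                      # ask the job to stop
            return False, future.result()   # wait for graceful shutdown
```

The key design point carries over to Spring Batch: the watchdog only *requests* termination; the job has to notice the request at safe points, which is why long-running steps should check their stop flag between items or chunks.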