spawn

Using two commands (using pipe |) with spawn

偶尔善良 submitted on 2019-12-06 03:54:22
I'm converting a doc to a pdf (unoconv) in memory and printing (pdftotext) in the terminal with:

```shell
unoconv -f pdf --stdout sample.doc | pdftotext -layout -enc UTF-8 - out.txt
```

This works. Now I want to use this command with child_process.spawn:

```javascript
let filePath = "...",
    process = child_process.spawn("unoconv", [
      "-f", "pdf", "--stdout", filePath,
      "|", "pdftotext", "-layout", "-enc", "UTF-8", "-", "-"
    ]);
```

In this case, only the first command (before the |) works. Is it possible to do what I'm trying? Thanks.

UPDATE - Result of sh -c- ....:

```shell
bash-3.2$ sh -c- unoconv -f pdf --stdout /Users/fatimaalves
```

mpi4py: close MPI Spawn?

北城以北 submitted on 2019-12-06 03:24:56
I have some Python code in which I very often Spawn multiple processes. I get an error:

```
ORTE_ERROR_LOG: The system limit on number of pipes a process can open was reached in file odls_default_module.c at line 809
```

My code roughly looks like this:

```python
from mpi4py import MPI
comm = MPI.COMM_WORLD
...
icomm = MPI.COMM_SELF.Spawn(sys.executable, args=["front_process.py", str(rank)], maxprocs=no_fronts)
...
message = icomm.recv(source=MPI.ANY_SOURCE, tag=21)
...
icomm.Free()
```

The Spawn command is called very often, and I think the spawned communicators remain "open" after I am finished, despite the icomm.Free() call. How do I

Run Ant target in background without using spawn=true

半世苍凉 submitted on 2019-12-05 15:10:54
I would like to start a server in the background, go back and execute some other targets, and then stop the server when Ant finishes executing all targets. I have come up with the following two solutions, but they both block Ant from executing the subsequent targets. Since I want the process to die in the end, I do not want to use spawn="true". Is there any other solution?

```xml
<target name="Start_Selenium_Server">
  <java dir="lib" jar="lib/selenium-server-standalone-2.28.0.jar" fork="true">
    <arg line="-singleWindow -userExtensions user-extensions.js"/>
  </java>
</target>
<target name="Start_Selenium
```

How do I spawn a separate python process?

微笑、不失礼 submitted on 2019-12-05 09:38:19
I need to spawn a separate Python process that runs a sub script. For example: main.py runs and prints some output to the console. It then spawns sub.py, which starts a new process. Once main.py has spawned sub.py it should terminate whilst sub.py continues to run. Thank you.

Edit: When I run main.py it prints 'main.py' but nothing else, and sub.py doesn't launch.

main.py:

```python
print "main.py"
import subprocess as sp
process = sp.Popen('sub.py', shell=True, stdout=sp.PIPE, stderr=sp.PIPE)
out, err = process.communicate(exexfile('sub.py')) # The output and error streams
raw_input("Press any key to end.")
```

Execute shell script in gruntfile and assign result to variable

梦想的初衷 submitted on 2019-12-05 04:26:29
I am using grunt to manage a suite of mocha-run tests. One of the things required in the mocha test suite is that certain environment variables be set so that the tests are executed properly based on the environment of the developer running the tests. One of these environment variables will have a different value on every developer's machine, so we execute a bash script to return that value for the environment variable we are setting. I am using grunt.util.spawn to run the script and assign its result to a variable defined in my gruntfile, and then grunt-env to set the environment variable

Errno::ENOMEM: Cannot allocate memory - cat

最后都变了- submitted on 2019-12-04 23:39:43
I have a job running on production which processes XML files. There are around 4,000 XML files, 8 to 9 GB in size all together. After processing we get CSV files as output. I have a cat command which merges all the CSV files into a single file, and I'm getting:

```
Errno::ENOMEM: Cannot allocate memory
```

on the cat (backtick) command. A few details:

System memory - 4 GB
Swap - 2 GB
Ruby: 1.9.3p286

Files are processed using nokogiri and saxbuilder-0.0.8. There is a block of code which processes the 4,000 XML files and saves the output as CSV (1 per XML) (sorry, I'm not supposed to share it because of

Erlang: How to view output of io:format/2 calls in processes spawned on remote nodes

牧云@^-^@ submitted on 2019-12-04 13:24:28
Question: I am working on a decentralized Erlang application. I am currently working on a single PC and creating multiple nodes by initializing erl with the -sname flag. When I spawn a process using spawn/4 on its home node, I can see the output generated by calls to io:format/2 within that process in its home erl instance. When I spawn a process remotely by using spawn/4 in combination with register_name, output of io:format/2 is sometimes redirected back to the erl instance where the remote spawn/4 call

Spawn command not found

ⅰ亾dé卋堺 submitted on 2019-12-04 12:15:48
I have an error trying to run a .sh file:

```
line 2: spawn: command not found
": no such file or directory
bash.sh: line 3: expect: command not found
bash.sh: line 4: send: command not found
```

```tcl
#!/usr/bin/expect -f
spawn sftp -o IdentityFile=MyFile.ppk 500200243@XXX.XXX.XXX.XXX
expect "XXX.XXX.XXX.XXX.gatewayEnter passphrase for key 'MyFile.ppk.ppk':"
send "myPassword"
```

Any idea why it happens?

It works OK for me (error from sftp: `ssh: Could not resolve hostname XXX.XXX.XXX.XXX: Name or service not known`), though the .sh extension for an expect (tcl) script is a little off-putting ;-) Often when

Handle multiple Spawn process in expect script

坚强是说给别人听的谎言 submitted on 2019-12-04 06:07:10
Question: Here is my use case for an expect script (one of a few I have). I want to run multiple sed commands over ssh. It's like a pre-build environment setup. I want to run something like this:

```tcl
#!/usr/bin/expect
set timeout -1
spawn -noecho bash -c "ssh -t user@host 'sed -i <some_stuff1> <file1>'"
spawn -noecho bash -c "ssh -t user@host 'sed -i <some_stuff2> <file2>'"
spawn -noecho bash -c "ssh -t user@host 'sed -i <some_stuff3> <file3>'"
expect {
    -re ".*sword.*" { exp_send "$env(PASS_WORD)\n" exp_continue
```

NodeJS spawn stdout string format

只谈情不闲聊 submitted on 2019-12-04 01:19:26
I'm spawning a process in node and tracking the output of the command like this:

```javascript
proc.stdout.on("data", function (data) {
  console.log(data.toString());
});
```

It works well; however, the output seems to be splitting the lines:

```
npm http 304 https://registry.npmjs.org/underscore
```

The above is just one line out of the response from an npm install. Typically this is all in one line; it's also adding line breaks before and after the response. Is there a way to get the data output to look like the standard run, i.e. line-by-line?

Streams are buffered and emit data events whenever they please (so to