proc-open

Connect pipes of processes in PHP

Submitted by 隐身守侯 on 2019-12-22 12:19:57
Question: I would like the output of one process created with proc_open to be piped to another one created with proc_open (in PHP). For example, in bash I can do:

[herbert@thdev1 ~]$ cat foo
2
3
1
[herbert@thdev1 ~]$ cat foo | sort
1
2
3
[herbert@thdev1 ~]$

I would like to simulate this in PHP using proc_open (instead of shell_exec) in order to have control over return codes, pipes, etc. So I want something like this:

$catPipes = array();
$sortPipes = array();
$cwd = '/tmp';
$env = array();
$catProcess =
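
A minimal sketch of one way to finish that thought: a descriptor in proc_open's spec may be an already-open stream, so cat's stdout pipe can be handed straight to sort as its stdin. Assumes /tmp/foo holds the sample data; error handling is omitted.

<?php
$cwd = '/tmp';

// Start cat; we only ask for its stdout as a pipe.
$catProcess = proc_open('cat foo',
    array(1 => array('pipe', 'w')),
    $catPipes, $cwd);

// Start sort, wiring cat's stdout stream in as sort's stdin.
$sortProcess = proc_open('sort',
    array(
        0 => $catPipes[1],          // read end of cat's stdout pipe
        1 => array('pipe', 'w'),
    ),
    $sortPipes, $cwd);
fclose($catPipes[1]);               // only the children need it now

echo stream_get_contents($sortPipes[1]);   // "1\n2\n3\n"
fclose($sortPipes[1]);

// The return codes the question wanted control over:
$catExit  = proc_close($catProcess);
$sortExit = proc_close($sortProcess);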

Multiple input with proc_open()

Submitted by 北战南征 on 2019-12-20 03:43:09
Question: I'm currently working on an online program. I'm writing a PHP script that executes a command on the command line with proc_open() (under Linux Ubuntu). This is my code so far:

<?php
$cmd = "./power";
$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "w"),
);
$process = proc_open($cmd, $descriptorspec, $pipes);
if (is_resource($process)) {
    fwrite($pipes[0], "4");
    fwrite($pipes[0], "5");
    fclose($pipes[0]);
    while ($pdf_content = fgets($pipes[1])) {
        echo
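
If ./power reads its input line by line (a guess, since the binary isn't shown), the two writes above arrive as the single token "45" with no end-of-line; terminating each value with "\n" is the usual fix. A sketch:

<?php
$process = proc_open('./power', array(
    0 => array('pipe', 'r'),
    1 => array('pipe', 'w'),
    2 => array('pipe', 'w'),
), $pipes);

if (is_resource($process)) {
    fwrite($pipes[0], "4\n");   // first input, newline-terminated
    fwrite($pipes[0], "5\n");   // second input
    fclose($pipes[0]);          // EOF tells the child no more input is coming

    echo stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($process);
}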

PHP using proc_open so that it doesn't wait for the script it opens (runs) to finish?

Submitted by 我的未来我决定 on 2019-12-12 03:50:03
Question: I've spent a while on this but can't get it to work. I apologize, as I asked a somewhat related question about this earlier but deleted it so I could do more research to narrow down the question; at this point I am stuck, as I thought I had the solution but it's not working as I expect. I found a forum where someone asked a similar question and was given code like the example below, which I am trying. It does run the script, but it still waits for it to finish before going to the next line in
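
A minimal sketch of the usual fix on Unix-like systems (assuming that is the platform here): proc_open itself returns immediately, but open pipes or a later proc_close() will block until the child exits, so detach the child by discarding its output and backgrounding it in a shell. The script path is illustrative.

<?php
// Backgrounding plus redirection detaches the child completely, so
// nothing below blocks on /path/to/longtask.php finishing.
$cmd = 'php /path/to/longtask.php > /dev/null 2>&1 &';
$process = proc_open($cmd, array(), $pipes);
proc_close($process);   // waits only for the wrapping shell, which exits at once
echo "already past it\n";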

Sending signals to a process opened by proc_open()

Submitted by 末鹿安然 on 2019-12-11 03:28:46
Question: We've got a utility here that's using proc_open() to call ssh to run commands on a remote machine. However, in some cases we need to halt the command on the remote machine, but proc_close() and proc_terminate() do not cause the desired signal to be sent through to the far side of the ssh connection. SSH will generally issue a SIGHUP to running programs when it is terminated, but we need to send a SIGINT to ssh, which it will forward through to the program running on the remote end. I've googled
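
One approach that reportedly works (a sketch, not a verified recipe): force a pseudo-tty with "ssh -t -t". With a tty allocated, the remote side interprets a literal Ctrl-C byte (0x03) on the channel as an interrupt, so writing that byte to ssh's stdin delivers SIGINT to the remote program; proc_terminate($process, SIGINT) would only signal the local ssh client. Host and command below are placeholders.

<?php
// Force pty allocation so control characters travel to the remote side.
$process = proc_open(
    'ssh -t -t user@remote long-running-command',
    array(0 => array('pipe', 'r'), 1 => array('pipe', 'w'), 2 => array('pipe', 'w')),
    $pipes
);

// ... later, when the remote command must be halted:
fwrite($pipes[0], "\x03");   // literal ^C; the remote pty turns it into SIGINT
fflush($pipes[0]);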

Environment is not passed to process opened by proc_open

Submitted by China☆狼群 on 2019-12-10 23:58:28
Question: I have a problem passing environment variables to processes that I opened with proc_open. I found the following example at http://us2.php.net/manual/en/function.proc-open.php:

<?php
$descriptorspec = array(
    0 => array("pipe", "r"),   // stdin is a pipe that the child will read from
    1 => array("pipe", "w"),   // stdout is a pipe that the child will write to
    2 => array("file", "/tmp/error-output.txt", "a")  // stderr is a file to write to
);
$cwd = '/tmp';
$env = array('some_option' => 'aeiou');
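
Worth knowing when debugging this (a sketch, with "env" standing in for the real child command): once $env is given as an array, the child receives only those variables and inherits nothing from the parent. Merging the current environment back in is the usual remedy; getenv() with no arguments (PHP 7.1+) returns it as an array.

<?php
// The child receives exactly $env, nothing more, so merge in the
// parent's environment if the child also needs PATH, HOME, etc.
$env = array_merge(getenv(), array('some_option' => 'aeiou'));

$descriptorspec = array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("file", "/tmp/error-output.txt", "a"),
);
$process = proc_open('env', $descriptorspec, $pipes, '/tmp', $env);
fclose($pipes[0]);
echo stream_get_contents($pipes[1]);   // some_option=aeiou should be listed
fclose($pipes[1]);
proc_close($process);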

Not getting entire response from popen

Submitted by 守給你的承諾、 on 2019-12-10 16:36:20
Question: Hi, I'm running a process with popen:

$handle = popen('python scriptos.py', "r");
while (!feof($handle)) {
    $data = fgets($handle);
    echo "> ".$data;
}

And I'm only getting 3 lines from a process that returns 5 lines. If I run this exact command in the CLI I get more of the response. It's as if it stops reading early (the process can take time to complete and updates the next 2 lines while working; it's a progress indicator). Am I doing anything wrong? Is proc_open more suitable (I've started seeing if I
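
A likely explanation, with a sketch of the workaround: progress indicators are usually printed to stderr, which popen(..., 'r') never captures, and lines redrawn in place end with "\r" rather than "\n", so fgets() sits waiting for a newline that never comes. Merging stderr into stdout and reading raw chunks tends to recover the "missing" lines.

<?php
// 2>&1 folds stderr into the handle; fread() does not wait for "\n".
$handle = popen('python scriptos.py 2>&1', 'r');
while (!feof($handle)) {
    echo '> ' . fread($handle, 4096);
}
pclose($handle);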

Get process resource by PID

Submitted by 两盒软妹~` on 2019-12-10 11:55:20
Question: I want to write a web SSH console, and I have found two problems. What I want to do: first, I want to execute a start.php file which has the following code:

$process = proc_open('start', array(
    0 => array("pipe", "r"),
    1 => array("pipe", "w"),
    2 => array("pipe", "a")
), $pipes);

Second, I want to run a command.php file which runs a command on the process created in start.php and gets the results from it:

$pid = 12345;
print_r(process_command('ping google.com', $pid));

I just want to access the process (cmd) created
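
For what it's worth, a proc_open resource cannot be recovered from the PID in a later request; the handle dies with the request that created it. One workaround is to give the long-lived process named pipes (FIFOs) as its stdio, so any later request can reach it through the filesystem. A rough sketch, Linux-only, with illustrative paths; the 'r+' open mode keeps fopen from blocking on a FIFO that has no peer yet, and in practice the shell would also need to be detached (e.g. via setsid/nohup) so it outlives the request that started it.

<?php
// start.php: create the FIFOs and attach them to a shell's stdio.
$in  = '/tmp/console-in';
$out = '/tmp/console-out';
posix_mkfifo($in, 0600);
posix_mkfifo($out, 0600);

$process = proc_open('bash', array(
    0 => fopen($in,  'r+'),   // child's stdin comes from the input FIFO
    1 => fopen($out, 'r+'),   // stdout and stderr go to the output FIFO
    2 => fopen($out, 'r+'),
), $pipes);

// command.php (a separate request) needs no process handle at all:
file_put_contents('/tmp/console-in', "ping -c 1 google.com\n");
echo fgets(fopen('/tmp/console-out', 'r'));   // read one line of the reply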

PHP: Zip a file stream on the fly on an intermediate server, without storing (too much) data

Submitted by 谁说我不能喝 on 2019-12-10 10:35:07
Question: Related:
- On-the-fly zipping & streaming of large files, in PHP or otherwise
- Streaming a large file using PHP

I'm looking for a combination of the methods described in those topics. I probably need to read the file (from a URL) in small chunks, pipe these into the STDIN of a proc_open zip command, grab the output, and flush it towards the client. What I need to do:
- Read a file stream from a URL from a storage server
- Zip it on the fly on a webserver
- Offer it as a download to the web browser, using
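
A sketch along those lines, assuming Info-ZIP's zip binary is available ("zip - -" reads the data from stdin and writes the archive to stdout; the single entry is named "-"). Writes and reads are interleaved so neither pipe fills up and deadlocks; the URL and filenames are illustrative.

<?php
$src = fopen('https://storage.example.com/big.bin', 'r');   // illustrative URL

header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="big.zip"');

$proc = proc_open('zip -q - -', array(
    0 => array('pipe', 'r'),    // we feed the raw file in here
    1 => array('pipe', 'w'),    // the archive comes out here
), $pipes);

stream_set_blocking($pipes[1], false);

while (!feof($src)) {
    fwrite($pipes[0], fread($src, 65536));              // feed one chunk to zip
    while (($out = fread($pipes[1], 65536)) !== '' && $out !== false) {
        echo $out;                                      // push whatever zip emitted
        flush();
    }
}
fclose($pipes[0]);                                      // EOF lets zip finalize

stream_set_blocking($pipes[1], true);
while (!feof($pipes[1])) {                              // drain the central directory
    echo fread($pipes[1], 65536);
    flush();
}
fclose($pipes[1]);
proc_close($proc);
fclose($src);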

Proper shell execution in PHP

Submitted by 柔情痞子 on 2019-12-09 09:44:57
Question: The problem: I was using a function that made use of proc_open() to invoke shell commands. It seems the way I was doing STDIO was wrong and sometimes caused PHP or the target command to lock up. This is the original code:

function execute($cmd, $stdin = null) {
    $proc = proc_open($cmd, array(
        0 => array('pipe', 'r'),
        1 => array('pipe', 'w'),
        2 => array('pipe', 'w')
    ), $pipes);
    fwrite($pipes[0], $stdin);
    fclose($pipes[0]);
    $stdout = stream_get_contents($pipes[1]);
    fclose($pipes[1]);
    $stderr = stream_get_contents($pipes
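
The usual fix, sketched below: the code above can deadlock because the child may fill its stderr pipe while PHP blocks draining stdout (or vice versa). Reading both pipes with stream_select() removes that ordering dependency. (Writing all of $stdin up front can still block for very large inputs; a complete version would select on the write end too.)

<?php
function execute($cmd, $stdin = null) {
    $proc = proc_open($cmd, array(
        0 => array('pipe', 'r'),
        1 => array('pipe', 'w'),
        2 => array('pipe', 'w'),
    ), $pipes);

    fwrite($pipes[0], $stdin);
    fclose($pipes[0]);

    $stdout = $stderr = '';
    stream_set_blocking($pipes[1], false);
    stream_set_blocking($pipes[2], false);

    // Keep reading whichever pipe has data until both have hit EOF.
    while ($pipes[1] !== null || $pipes[2] !== null) {
        $read = array_filter(array($pipes[1], $pipes[2]));
        $write = $except = array();
        if (stream_select($read, $write, $except, null) === false) break;
        foreach ($read as $stream) {
            $data = fread($stream, 8192);
            if ($data === '' || $data === false) {       // EOF on this pipe
                fclose($stream);
                if ($stream === $pipes[1]) $pipes[1] = null; else $pipes[2] = null;
            } elseif ($stream === $pipes[1]) {
                $stdout .= $data;
            } else {
                $stderr .= $data;
            }
        }
    }
    return array('stdout' => $stdout, 'stderr' => $stderr, 'exit' => proc_close($proc));
}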

Run perl file from PHP script but not wait for output on Windows Server

Submitted by 有些话、适合烂在心里 on 2019-12-08 18:06:30
I'm trying to execute a perl script from within a php script. I have had this working using various methods such as exec, popen and proc_open, but I have a couple of issues to get around which good old Google isn't giving me the answers to. I need to run the .pl script (passing one variable to the script, which is a number) from within the php file, but stop the php script from waiting until the .pl has finished (the .pl is likely to take 4-5 hours to run on the server). I'm not expecting any return output from the perl script (the perl script logs its output to a mysql db), so I just need it to
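
A sketch of the common Windows answer: let cmd's "start /B" launch the perl job and return immediately, so pclose() is not held up by the 4-5 hour run. The path and argument are illustrative.

<?php
$number = 42;   // the single numeric argument to pass along
// "start /B" detaches the job; the shell (and thus popen) returns at once.
pclose(popen('start /B perl C:\scripts\longjob.pl ' . escapeshellarg((string)$number), 'r'));
// PHP carries on immediately; the perl script logs to MySQL on its own.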