stdout

Linux: Capture output of an already running process (in pure C!)

家住魔仙堡 submitted on 2019-12-04 13:25:56
Question: My situation is the following: I've got a lot of small gizmos (pretty close to routers; not exactly, but that's irrelevant); they are running a bare-bones MIPS-based Linux distro. To control them, one can telnet there (through the serial port) and issue commands to an interactive bash-like shell, which then writes back some output. The shell's input and output are both attached to /dev/ttyAS0. Now, I'd like to automate all of this, i.e. write a program that will run inside the gizmo, be a…

Sending strings between two Python scripts using subprocess PIPEs

雨燕双飞 submitted on 2019-12-04 12:57:33
I want to open a Python script using subprocess in my main Python program. I want these two programs to be able to chat with one another as they are both running, so I can monitor the activity in the slave script, i.e. I need them to send strings between each other. The main program will have a function similar to this that will communicate with and monitor the slave script:

Script 1:

    import subprocess
    import pickle
    import sys
    import time
    import os

    def communicate(clock_speed, channel_number, frequency):
        p = subprocess.Popen(['C:\\Python27\\pythonw', 'test.py'],
                             stdin=subprocess.PIPE, stdout…
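A minimal sketch of this kind of two-way exchange. It is hypothetical: the slave script is inlined via -c instead of living in a separate test.py, and the protocol is one reply line per command line.

```python
import subprocess
import sys

# Hypothetical slave script, inlined with -c instead of a separate test.py:
# it reads each line from stdin and echoes a reply line back.
slave = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    sys.stdout.write('slave saw: ' + line)\n"
    "    sys.stdout.flush()\n"
)

p = subprocess.Popen([sys.executable, "-u", "-c", slave],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

p.stdin.write("hello\n")       # send a string to the slave...
p.stdin.flush()                # ...and flush, or the slave may never see it
reply = p.stdout.readline()    # one reply per command keeps both sides in step

p.stdin.close()                # signal end-of-input so the slave exits
p.wait()
```

Agreeing on a line-oriented, one-reply-per-command protocol like this avoids the classic pipe deadlock, since each side always knows how much to read before writing again.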

Segmentation fault while redirecting sys.stdout to Tkinter.Text widget

为君一笑 submitted on 2019-12-04 12:51:17
Question: I'm in the process of building a GUI-based application with Python/Tkinter that builds on top of the existing Python bdb module. In this application, I want to silence all stdout/stderr from the console and redirect it to my GUI. To accomplish this, I've written a specialized Tkinter.Text object (code at the end of the post). The basic idea is that when something is written to sys.stdout, it shows up as a line in the Text widget in black. If something is written to sys.stderr,…
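The redirection pattern behind this can be sketched without a running GUI, since sys.stdout only needs an object with a write() method. In this sketch FakeText is a stand-in for a real Tkinter.Text widget (its insert() mirrors the Text.insert(index, text, tag) signature); the class names are illustrative, not from the question:

```python
import sys

class TextRedirector:
    """File-like object that forwards writes to a Text-like widget with a tag."""
    def __init__(self, widget, tag="stdout"):
        self.widget = widget
        self.tag = tag

    def write(self, text):
        # A real Tkinter.Text would render 'text' at the end, colored via the tag.
        self.widget.insert("end", text, self.tag)

    def flush(self):
        pass  # print() may call flush(); nothing to do for a widget

class FakeText:
    """Stand-in for Tkinter.Text so the sketch runs headless."""
    def __init__(self):
        self.lines = []

    def insert(self, index, text, tag):
        self.lines.append((tag, text))

widget = FakeText()
old_stdout = sys.stdout
sys.stdout = TextRedirector(widget, "stdout")
print("hello")                 # ends up in the widget, not on the console
sys.stdout = old_stdout        # always restore the real stream
```

With a real widget, a second TextRedirector with tag "stderr" and a red tag configuration gives the black/red split the question describes.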

Sharing stdout among multiple threads/processes

回眸只為那壹抹淺笑 submitted on 2019-12-04 11:33:59
I have a Linux program (the language doesn't matter) which prints its log to stdout. The log IS needed for monitoring of the process. Now I'm going to parallelize it by forking or using threads. The problem: the resulting stdout will contain an unreadable mix of unrelated lines... And finally, the question: how would you reconstruct the output logic for parallel processes?

Sorry for answering myself... The definitive solution was to use the GNU parallel utility. It came as a replacement for the well-known xargs utility, but runs the commands in parallel, separating the output into groups. So I just left my…
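Within a single process, the same "no mixed lines" guarantee can be sketched with an explicit lock held per log line. This is an alternative to GNU parallel (which handles the multi-process case by buffering each job's output); io.StringIO stands in for the shared stdout so the result can be inspected:

```python
import io
import threading

lock = threading.Lock()
log = io.StringIO()            # stands in for the shared stdout

def worker(n):
    for i in range(3):
        line = f"worker {n}: step {i}\n"
        with lock:             # one writer at a time, so lines never interleave
            log.write(line)

threads = [threading.Thread(target=worker, args=(n,)) for n in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

lines = log.getvalue().splitlines()   # every line arrives intact, order aside
```

For forked processes rather than threads, the lock would have to be an OS-level one (e.g. flock on a shared file), which is exactly the bookkeeping GNU parallel takes off your hands.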

node.js child_process.spawn no stdout unless 'inherit'

天涯浪子 submitted on 2019-12-04 11:25:35
Question: I'm trying to capture the stdout from a spawned child_process in node.js (0.10.29). Right now I'm just trying with ping. The following code doesn't print (but does ping):

    var exec = require('child_process').exec;
    var spawn = require('child_process').spawn;
    var util = require('util');

    var ping = spawn('ping', ['127.0.0.1'], {stdio: 'pipe'});
    ping.stdout.on('data', function(data){
        util.print(data);
    });
    ping.stderr.on('data', function(data){
        util.print(data);
    });

If I change stdio: 'pipe' to stdio:…

bash redirect to /dev/stdout: Not a directory

ぐ巨炮叔叔 submitted on 2019-12-04 11:19:54
I recently upgraded from CentOS 5.8 (with GNU bash 3.2.25) to CentOS 6.5 (with GNU bash 4.1.2). A command that used to work on CentOS 5.8 no longer works on CentOS 6.5. It is a silly example with an easy workaround, but I am trying to understand what is going on under the bash hood to cause the different behavior. Maybe it is a new bug in bash 4.1.2, or an old bug that was fixed, so the new behavior is expected?

CentOS 5.8:

    (echo "hi" > /dev/stdout) > test.txt
    echo $?
    0
    cat test.txt
    hi

CentOS 6.5:

    (echo "hi" > /dev/stdout) > test.txt
    -bash: /dev/stdout: Not a directory
    echo $?
    1

How to redirect STDOUT and STDERR to a variable

北战南征 submitted on 2019-12-04 11:05:31
Question: I want to redirect STDERR and STDOUT to a variable. I did this:

    close(STDOUT);
    close(STDERR);
    my $out;
    open(STDOUT, ">>", \$out);
    open(STDERR, ">>", \$out);
    for (1..10) {
        print "print\n";   # this is ok
        warn "warn\n";     # same
        system("make");    # this is lost: neither on screen nor in the variable
    }

The problem is with system. I want the output of this call to be captured too.

Answer 1: Are you seeking to capture the output in a variable? If so, you have to use backticks or qx{} with appropriate redirection.
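For comparison, the same pitfall exists in Python: rebinding the high-level stdout object captures only output that goes through it, while a child process spawned with system()-style calls writes to the real file descriptor 1 and bypasses the variable. A sketch (names illustrative):

```python
import contextlib
import io
import subprocess
import sys

buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    print("captured")          # goes through sys.stdout, so it lands in buf
    # The child writes to the real file descriptor 1, not to buf:
    subprocess.run([sys.executable, "-c", "print('lost')"])

inside = buf.getvalue()        # contains "captured" but not "lost"

# To capture a child's output too, ask for it explicitly (the Python
# counterpart of Perl's backticks / qx{} with redirection):
result = subprocess.run([sys.executable, "-c", "print('captured child')"],
                        capture_output=True, text=True)
```

The underlying reason is the same in both languages: the child inherits OS-level file descriptors, not the parent's in-memory stream objects.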

Read from pipe line by line in C

妖精的绣舞 submitted on 2019-12-04 10:57:17
Question: How can I separate the lines which are coming from a pipe? The pipe carries this text:

    HALLO:500\n
    TEST:300\n
    ADAD ADAWFFA AFAGAGAEG

I want to separate the lines from the pipe because I want to save the values in variables. Here is my C code:

    #include <stdio.h>
    #include <stdlib.h>

    #define BUFFERSIZE 1

    int main(int argc, char **argv) {
        unsigned char buffer[BUFFERSIZE];
        FILE *instream;
        int bytes_read = 0;
        int buffer_size = 0;

        buffer_size = sizeof(unsigned char) * BUFFERSIZE;

        /* open stdin for…
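For comparison, the intended logic (split the pipe's bytes into lines, then split each NAME:VALUE line on the colon) can be sketched in Python; the parse_stream helper is illustrative, not from the question:

```python
import io

def parse_stream(stream):
    """Collect NAME:VALUE pairs from a pipe-like stream, one line at a time."""
    values = {}
    for line in stream:            # iterating a text stream yields whole lines
        line = line.strip()
        name, sep, value = line.partition(":")
        if sep:                    # keep only lines that actually had a colon
            values[name] = value
    return values

# io.StringIO stands in for the real pipe (e.g. sys.stdin):
pairs = parse_stream(io.StringIO("HALLO:500\nTEST:300\nADAD ADAWFFA AFAGAGAEG\n"))
```

In C the equivalent is to read with fgets() into a reasonably sized buffer (rather than one byte at a time) and split each line with strchr() or strtok() on ':'.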

Asynchronously redirect stdout/stdin from embedded Python to C++?

自作多情 submitted on 2019-12-04 10:38:15
Question: I am essentially trying to write a console interface with input and output for an embedded Python script. Following the instructions here, I was able to capture stdout:

    Py_Initialize();
    PyRun_SimpleString("\
    class StdoutCatcher:\n\
        def __init__(self):\n\
            self.data = ''\n\
        def write(self, stuff):\n\
            self.data = self.data + stuff\n\
    import sys\n\
    sys.stdout = StdoutCatcher()");
    PyRun_SimpleString("some script");

    PyObject *sysmodule;
    PyObject *pystdout;
    PyObject *pystdoutdata;
    char *string;…
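The catcher embedded in the C string above is plain Python, so its behavior can be sketched and tested on its own. The flush() method here is an addition (embedded or library code sometimes calls it), and print stands in for the script run via PyRun_SimpleString:

```python
import sys

class StdoutCatcher:
    """Same catcher the C++ side installs with PyRun_SimpleString."""
    def __init__(self):
        self.data = ''

    def write(self, stuff):
        self.data = self.data + stuff

    def flush(self):
        pass  # addition: harmless if the embedded script calls flush()

old = sys.stdout
sys.stdout = StdoutCatcher()
print("hello from script")     # stands in for PyRun_SimpleString("some script")
caught = sys.stdout.data       # what the C++ side fetches via sys.stdout.data
sys.stdout = old
```

On the C++ side, the `data` attribute is what the truncated code goes on to read: get the `sys` module, fetch its `stdout` attribute, then the `data` attribute as a string.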

How to avoid the deadlock in a subprocess without using communicate()

本秂侑毒 submitted on 2019-12-04 10:38:10
Question:

    proc = subprocess.Popen(['start'], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    proc.stdin.write('issue commands')
    proc.stdin.write('issue more commands')
    output = proc.stdout.read()  # deadlocks here; and I actually have more commands to issue after this

I know that communicate() can give me a solution to this problem, but I want to issue more commands later, and communicate() effectively closes the subprocess. Is there any known workaround for this? I am trying to interact with a router using a…
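One common workaround is a background reader thread that drains the child's stdout into a queue, so the parent can keep writing commands without ever blocking on a read. A sketch with a hypothetical echoing child standing in for the router session:

```python
import queue
import subprocess
import sys
import threading

# Hypothetical interactive child: echoes each input line back with a prefix.
# It stands in for the router session the question describes.
child_code = (
    "import sys\n"
    "for line in sys.stdin:\n"
    "    sys.stdout.write('got ' + line)\n"
    "    sys.stdout.flush()\n"
)

proc = subprocess.Popen([sys.executable, "-u", "-c", child_code],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)

replies = queue.Queue()

def drain(pipe):
    # Runs in the background, so reads never block the main thread.
    for line in pipe:
        replies.put(line)

threading.Thread(target=drain, args=(proc.stdout,), daemon=True).start()

proc.stdin.write("show version\n")
proc.stdin.flush()                      # flush after every command
reply = replies.get(timeout=5)          # reply to the first command

proc.stdin.write("exit\n")              # ...and more commands can still follow
proc.stdin.flush()
proc.stdin.close()
proc.wait()
```

Unlike communicate(), the pipes stay open between commands; the cost is that the parent must know how many reply lines to expect (or use a prompt/sentinel to detect the end of each command's output).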