stdout

Capturing output from buffered StdOut program

孤街浪徒 submitted on 2019-12-24 19:10:16
Question: I'm trying to capture the output of a Windows program using Qt and Python. I'm starting the process with QProcess, but the problem is that the output is being buffered. Unfortunately I don't have access to the source, and therefore can't flush the output. From my searching around, I found the program "Expect", but I don't know if there is a free Windows version floating around. It would be nice to do it purely in Python, though. Answer 1: Please take a look at QSharedMemory: http://doc.trolltech.com
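A minimal sketch of reading child output as it arrives, assuming the buffering can be influenced at all: if the buffered child happens to be a Python program, launching it with -u disables its buffering; the inline child command below is a hypothetical stand-in, not the asker's program.

```python
import subprocess
import sys

# Hypothetical child: a Python one-liner run unbuffered via -u.
child = [sys.executable, "-u", "-c", "print('line 1'); print('line 2')"]

# bufsize=0 keeps the parent side of the pipe unbuffered too, so each
# line is available as soon as the child flushes it.
proc = subprocess.Popen(child, stdout=subprocess.PIPE, bufsize=0)
lines = [raw.decode().rstrip() for raw in proc.stdout]
proc.wait()
print(lines)
```

For a third-party binary that block-buffers when its stdout is a pipe, a pseudo-terminal (as Expect provides) is usually the only fix, since the buffering decision is made inside the child.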

Python multiprocessing within node.js - Prints on sub process not working

有些话、适合烂在心里 submitted on 2019-12-24 18:48:54
Question: I have a Node.js application that runs a client interface which exposes an action that triggers machine-learning tasks. Since Python is a better choice for implementing machine-learning functionality, I've implemented a Python application that runs machine-learning tasks on demand. Now I need to integrate both applications. It has been decided that we need to use a single (AWS) instance for both applications. One way found to do such integration was using the python-shell Node module. There,
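When Node's python-shell captures the Python process, the Python side's stdout is a pipe, so prints from multiprocessing workers are block-buffered and may never reach Node. A sketch of the usual fix, print(..., flush=True), using a hypothetical child script run via subprocess to stand in for the python-shell setup:

```python
import subprocess
import sys
import textwrap

# Hypothetical Python-side script: a multiprocessing worker whose print
# is flushed explicitly so the capturing parent sees it immediately.
child = textwrap.dedent("""
    import multiprocessing as mp

    def task():
        print("task output", flush=True)   # flush: stdout is a pipe here

    if __name__ == "__main__":
        p = mp.Process(target=task)
        p.start()
        p.join()
""")
out = subprocess.run([sys.executable, "-c", child],
                     stdout=subprocess.PIPE).stdout.decode()
print(out)
```

Without flush=True, the worker's output can sit in its buffer until the process exits, which looks to the Node side like prints "not working".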

Certain Python commands aren't caught in Stdout

放肆的年华 submitted on 2019-12-24 16:27:48
Question: I've written a simple program that captures and executes command-line Python scripts, but there is a problem. The text passed to a Python input function isn't written to my program, despite my program capturing stdout. For example, the Python script:

import sys
print("Hello, World!")
x = input("Please enter a number: ")
print(x)
print("This work?")

would write "Hello, World!" and then stop. When I pass it a number, it continues on, writing "Please enter a number: 3". What is going on? Any
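The likely cause: the prompt "Please enter a number: " ends without a newline, so a capturing program that reads line-by-line blocks until the next newline arrives, which only happens after the number is echoed. A sketch that reads the pipe in small chunks instead, using a hypothetical inline child script:

```python
import subprocess
import sys

# Hypothetical child: prompts via input() and echoes what it read.
script = "x = input('Enter: ')\nprint('got', x)\n"
proc = subprocess.Popen([sys.executable, "-u", "-c", script],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE)
proc.stdin.write(b"3\n")
proc.stdin.flush()
proc.stdin.close()

# Read one byte at a time so the newline-less prompt surfaces
# immediately instead of being held back by line-oriented reading.
captured = b""
while True:
    chunk = proc.stdout.read(1)
    if not chunk:
        break
    captured += chunk
proc.wait()
print(captured.decode())
```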

Subprocess displays sys.stdout correctly in PyCharm but not in the console

a 夏天 submitted on 2019-12-24 15:01:44
Question: I have a small Python program that executes a terminal command from a pip package called commandwrapper (which is a wrapper for subprocess.Popen: https://pypi.python.org/pypi/commandwrapper/0.7). I am also trying to capture the real-time output to the console and to a file. I have the code:

class Tee(object):
    def __init__(self, *files):
        self.files = files
    def write(self, obj):
        for f in self.files:
            f.write(obj)
            f.flush()
    def flush(self):
        for f in self.files:
            f.flush()

# Set the stdout/stderr
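For reference, the Tee pattern from the snippet above in self-contained, runnable form; the io.StringIO stands in for the asker's log file:

```python
import io
import sys

class Tee:
    """Duplicate writes to several file-like objects, flushing each one
    so console output appears in real time rather than on exit."""
    def __init__(self, *files):
        self.files = files

    def write(self, obj):
        for f in self.files:
            f.write(obj)
            f.flush()

    def flush(self):
        for f in self.files:
            f.flush()

log = io.StringIO()           # stand-in for an opened log file
original = sys.stdout
sys.stdout = Tee(original, log)
print("hello tee")            # goes to both the console and the log
sys.stdout = original
```

Note that replacing sys.stdout only affects writes made from Python itself; a subprocess writes to the real file descriptor and bypasses the Tee, which is why PyCharm (which uses its own console plumbing) and a plain terminal can behave differently.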

Make osascript print stdout interactively / in real-time

荒凉一梦 submitted on 2019-12-24 14:13:45
Question: OK, so I have this very simple Python script:

import time
import sys
for i in range(25):
    time.sleep(1)
    print(i)
sys.exit()

When I use Python to run it (/usr/local/bin/python3.6 testscript.py), all works fine and the output reads: 1 2 3 4 etc., with each number printed one second after the other. However, when I run:

/usr/bin/osascript -e 'do shell script "/usr/local/bin/python3.6 testscript.py" with prompt "Sart Testing " with administrator privileges'

there isn't any output for 25 seconds, and
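This is inherent to osascript: "do shell script" collects the script's entire output and returns it only when the script exits, so there is no per-line streaming to fix on the Python side. A sketch of the common workaround, redirecting the script's output to a file and following that file from outside; the inline child is a stand-in for testscript.py:

```python
import os
import subprocess
import sys
import tempfile
import time

# Hypothetical stand-in for testscript.py (short sleeps for brevity).
child = ("import time\n"
         "for i in range(3):\n"
         "    print(i, flush=True)\n"
         "    time.sleep(0.1)\n")

tmp = tempfile.NamedTemporaryFile("w", delete=False)
tmp.close()
with open(tmp.name, "w") as out:
    proc = subprocess.Popen([sys.executable, "-c", child], stdout=out)
    seen = []
    with open(tmp.name) as follow:      # tail the file as it grows
        while True:
            line = follow.readline()
            if line:
                seen.append(line.strip())
            elif proc.poll() is not None:
                # child exited: drain whatever was written last
                seen.extend(l.strip() for l in follow if l.strip())
                break
            else:
                time.sleep(0.05)
os.unlink(tmp.name)
print(seen)
```

Under osascript the equivalent is `do shell script ".../testscript.py > /tmp/out.log 2>&1 &"` followed by tailing /tmp/out.log.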

Cronjob - How to output stdout, and ignore stderr

烂漫一生 submitted on 2019-12-24 13:50:34
Question: Is it possible to output stdout to a file, but ignore stderr? I have a Python script that uses sys.stderr.write(error) to output errors to stderr. I'd like to ignore these for this particular script. How is this possible? Here is the current entry in the crontab:

* * * * * /Users/me/bin/scrape-headlines /Users/me/proxies.txt >> /Users/me/headlines.txt 2>&1

scrape-headlines is a bash script that calls the Python script. Answer 1: The 2>&1 redirects stderr to stdout, appending it to the headlines
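The crontab fix is to replace 2>&1 (which merges stderr into the captured stdout) with 2>/dev/null, which discards stderr. A sketch of the same separation done from Python with subprocess, using a hypothetical inline child in place of the scrape-headlines script:

```python
import subprocess
import sys

# Hypothetical child: writes one line to stdout and one to stderr.
child = ("import sys\n"
         "print('headline')\n"
         "sys.stderr.write('error\\n')\n")

# stdout is captured; stderr goes to DEVNULL, the Python equivalent
# of the shell's 2>/dev/null.
result = subprocess.run([sys.executable, "-c", child],
                        stdout=subprocess.PIPE,
                        stderr=subprocess.DEVNULL)
print(result.stdout.decode())
```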

A bit curious about `mysql -e` output format with & without redirect

时光毁灭记忆、已成空白 submitted on 2019-12-24 12:51:56
Question: Say when I run mysql -u user -p -e 'select id from db.users limit 1', I get:

+------+
| id   |
+------+
| 8434 |
+------+

When I redirect the output/stdout to some file, like mysql -u user -p -e 'select id from db.users limit 1' > /tmp/a.txt, and then cat /tmp/a.txt, I get:

id
8434

So where do those little format strings go? Does it mean that mysql knows when it is redirected, so it returns a different format? I always thought a redirect (>) doesn't concern the previous command, that it doesn
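Yes: the mysql client calls isatty() on its stdout and switches from the boxed table to tab-separated output when stdout is not a terminal (--table forces the boxed format either way). The redirect does affect the command in this sense, because the shell replaces the program's stdout file descriptor before starting it. A sketch of the same isatty() check from Python:

```python
import subprocess
import sys

# Hypothetical child: reports whether its stdout is a terminal,
# mirroring the check the mysql client performs.
child = ("import sys\n"
         "print('tty' if sys.stdout.isatty() else 'piped')\n")

# Captured through a pipe, so the child sees a non-tty stdout.
out = subprocess.run([sys.executable, "-c", child],
                     stdout=subprocess.PIPE).stdout.decode().strip()
print(out)
```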

Bash script to read from /dev/urandom

人盡茶涼 submitted on 2019-12-24 08:38:53
Question: I want to write a bash script that sends to stdout a file image containing only printable ASCII characters. My script should receive and accept only one argument containing the number of octets that should be read from /dev/urandom. So, I need to read a given number of octets from /dev/urandom to create a file image to send to stdout. I have this:

#!/usr/bin/env bash
X=1
if [ $X -eq 0 ]; then
    echo "Error: An argument is needed"
else
    strings /dev/urandom
    echo the result
fi

I'm checking if there's
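The same idea sketched in Python rather than bash: draw bytes from the OS entropy source and keep only printable ASCII until the requested count is reached. The function name and chunk size are illustrative choices, not from the original script:

```python
import os
import string

def random_printable(n):
    """Collect n printable, non-whitespace ASCII characters from the
    OS entropy source (the equivalent of filtering /dev/urandom)."""
    keep = set(string.printable) - set("\t\n\r\x0b\x0c")
    out = []
    while len(out) < n:
        for b in os.urandom(64):          # read entropy in small chunks
            ch = chr(b)
            if ch in keep and len(out) < n:
                out.append(ch)
    return "".join(out)

sample = random_printable(32)
print(sample)
```

In bash the usual equivalent is head -c "$1" /dev/urandom | tr -dc '[:print:]', with a loop to top up to the exact count, since tr discards an unpredictable number of bytes.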

Run java.exe through ProcessBuilder leads to application hanging?

不羁岁月 submitted on 2019-12-24 07:16:39
Question: I have the following code:

ProcessBuilder pb = new ProcessBuilder("C:\\Program Files\\Java\\jdk1.8.0_111\\bin\\java",
        "-cp", "project_folder\\target\\classes", "package.ExternalProcess");
Process p = pb.start();
OutputStream processOutputStream = p.getOutputStream();
IOUtils.write("1" + System.lineSeparator(), processOutputStream);
InputStream processInputStream = p.getInputStream();
System.out.println("--1--");
System.out.println(process.isAlive()); // outputs true
String result = IOUtils
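A hang in this pattern usually means the write to the child's stdin was never flushed (or the stream never closed), so the child blocks waiting for input while the parent blocks reading its output. Sketched here in Python, where subprocess.communicate() performs the write, the close, and the drain safely in one call; the inline child script is hypothetical:

```python
import subprocess
import sys

# Hypothetical child: reads one line from stdin, echoes it back.
child = "x = input()\nprint('child read', x)\n"
proc = subprocess.Popen([sys.executable, "-c", child],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE)

# communicate() writes the input, closes stdin so the child sees EOF,
# and drains stdout concurrently -- avoiding the mutual-wait deadlock.
out, _ = proc.communicate(b"1\n")
print(out.decode())
```

The Java equivalents are flushing (or closing) the OutputStream after IOUtils.write, and reading the child's stdout on a separate thread if more input will follow.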

Check for stdout or stderr

时光总嘲笑我的痴心妄想 submitted on 2019-12-24 05:02:55
Question: One of the binaries which I am using in my shell script is causing a segmentation fault (RETURN VALUE: 139), and even though I am redirecting both stdout and stderr to a logfile, the Segmentation Fault error message is displayed in the terminal when I run the shell script. Is it possible to redirect this message from the segfault to a logfile? Answer 1: The Segmentation Fault message you see is printed by the shell that is running your program. This behavior varies from shell to shell, so a
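Because the message comes from the shell and not the crashing binary, redirecting the binary's own streams cannot capture it; redirecting a sub-shell that runs the binary can, e.g. ( ./prog ) >> logfile 2>&1. A sketch of detecting the crash from a supervising Python process instead; the ctypes one-liner deliberately segfaults and stands in for the faulty binary:

```python
import subprocess
import sys

# Hypothetical stand-in for the crashing binary: reading address 0
# raises SIGSEGV in the child process.
child = "import ctypes\nctypes.string_at(0)"
proc = subprocess.run([sys.executable, "-c", child],
                      stdout=subprocess.DEVNULL,
                      stderr=subprocess.DEVNULL)

# A negative return code means the child was killed by a signal
# (shells report the same event as 128 + signal, hence 139 for SIGSEGV).
crashed = proc.returncode < 0
print("killed by signal" if crashed else "exited normally")
```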